[Bug] File Name Isn't Changed When Decompressing File
When using the copy activity to upload files from an on-premises file system to blob storage, decompressing the files en route per the instructions at https://docs.microsoft.com/en-us/azure/data-factory/supported-file-formats-and-compression-codecs#compression-support, the extension of the resulting file in blob storage is incorrect if the source file's extension is uppercase.
If I upload a file called 1234.GZ and set it up to decompress I get 1234.GZ in blob storage.
If I upload a file called 1234.gz and set it up to decompress I get 1234 in blob storage.
It seems that the file extension substitution is case-sensitive. Can this be changed so the extension is replaced regardless of casing?
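The expected behavior can be sketched as follows. This is a minimal illustration of case-insensitive extension stripping, not ADF's actual implementation; the helper name and the set of extensions are assumptions for the example.

```python
import os

# Hypothetical set of compression extensions (assumed for illustration).
COMPRESSION_EXTENSIONS = {".gz", ".gzip", ".bz2", ".zip"}

def strip_compression_extension(filename: str) -> str:
    """Remove a known compression extension regardless of its casing."""
    root, ext = os.path.splitext(filename)
    # Lowercase the extension before comparing, so ".GZ" matches ".gz".
    if ext.lower() in COMPRESSION_EXTENSIONS:
        return root
    return filename
```

With a case-insensitive comparison, both `1234.GZ` and `1234.gz` would decompress to a blob named `1234`.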
Gopinath S commented
I'm generating a CSV file with compression; after decompression the file has no extension at all.
Actual result: abc.gzip --> abc
Expected result: abc.gzip --> abc.csv
Joyce Xu commented
Hi Ben, thank you for reporting this issue to the Microsoft ADF team. This is a bug in the decompression phase: the file extension comparison is not handled case-insensitively. The issue will be fixed in the next ADF release.