How can we improve Azure Storage?

HTTP compression support for Azure Storage Services (via Accept-Encoding/Content-Encoding fields)

Add support for the 'Accept-Encoding' and 'Content-Encoding' HTTP header fields in Azure Storage Services, with gzip and deflate as the supported compression schemes. This should help reduce the network payload by 10x-50x in some cases, such as saving/loading bulk records to Azure Table Storage.
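
For illustration only, here is a minimal sketch (Python, using the third-party requests library) of what a compressed bulk insert could look like if the Table service honored these headers. It does not do so today, and the endpoint, payload shape, and authorization value below are placeholders rather than the real Table Storage batch protocol.

    import gzip
    import json

    import requests  # third-party: pip install requests

    # Hypothetical illustration only: Azure Table Storage does not accept
    # Content-Encoding today, so this exact request would be rejected.
    records = [{"PartitionKey": "p1", "RowKey": str(i), "Value": i} for i in range(1000)]
    body = json.dumps(records).encode("utf-8")
    compressed = gzip.compress(body)  # repetitive JSON typically shrinks 10x or more

    response = requests.post(
        "https://myaccount.table.core.windows.net/mytable",  # placeholder endpoint
        data=compressed,
        headers={
            "Content-Type": "application/json",
            "Content-Encoding": "gzip",          # request body is gzip-compressed
            "Accept-Encoding": "gzip, deflate",  # ask for a compressed response
            "Authorization": "<shared-key-or-sas>",  # placeholder credentials
        },
    )
    print(len(body), "bytes uncompressed,", len(compressed), "bytes on the wire,",
          "status", response.status_code)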

216 votes
Ilya Grebnov shared this idea

15 comments

  • Anonymous commented

    This can be done using a Logic App to watch a container for any added blob and then un-zip that blob into another container. There is, however, a 50MB limit on this operation in Logic Apps.

  • Anonymous commented

    We seriously need this feature. We are a new customer and have 78k PDFs to upload. Uploading a zip and having it decompress in the blob service would be ideal.

  • Patrick Jonas commented

    Thank you to you and the team for the support; you put time and effort into the work you do, and it is great to stay signed in to my account.

  • HS commented

    +1 This would be a great feature. It would help in onboarding new customers to Azure. One of the big hassles is uploading huge VHD files. If we can zip them, upload them and unzip them directly onto the storage account, it will save a lot of time.

  • Will Hancock commented

    Yeah, this is an awesome feature of S3, especially if you have lots of files, maybe thousands of tiny images, etc.

    I think on that basis you should revisit... or we'll look at advising our client to move to AWS.

  • Poojan Kothari commented

    I have a scenario where clients upload password-protected zip files to Azure Blob storage, and I want to process them on the blob service itself to get input streams out of them. Any suggestions?

  • Justin Chase commented

    AWS supports this when creating Lambda functions, by the way. You upload a zip to S3, then point Lambda at that zip, and it unpacks it onto the machine's file storage. Very fast compared to uploading each file independently.

  • Paul Smullen commented

    This would be a great feature: an API command to tell Azure to unzip to a given location. Right now, if a large zip (e.g. 1GB) has been uploaded, I need to download it, unzip it, and re-upload it. Very inefficient...

  • Bart Czernicki commented

    This is huge... inserting in batches of 100 causes a lot of chatter/responses from Azure Table Storage. On inserts this adds up to as much as 500 MB/hour of bandwidth for us; not a huge price concern, but still a lot.

    Another option would be to raise the batch limit to 1,000 or so for transferring large amounts of data; 100 is not enough.

  • Shawn Weisfeld commented

    I have to upload 2.2 billion small images (10 KB each). I built something like this using a worker role, but it would be great if it were built into the framework.

  • Charles Lamanna commented

    This would decrease our network costs by 10x or more, an amount that starts to add up since we transfer a fairly large amount of data given our size.

  • che commented

    Azure Blob storage is great for hosting lots of small files, like those needed to support DeepZoom or Pivot. However, uploading thousands of small files one by one is very slow. It would be great if I could upload a zip file and have Azure unzip it into blob storage (see the client-side sketch below for what this currently requires). Zip compression might also help reduce the upload bandwidth in certain situations.
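
Until a server-side unzip exists, the usual client-side workaround looks roughly like the sketch below (Python, using the azure-storage-blob SDK; the connection string, container names, and blob name are placeholders, and a very large archive would need to be streamed to disk rather than held in memory):

    import io
    import zipfile

    from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

    # Placeholders: substitute your own connection string, containers, and blob name.
    service = BlobServiceClient.from_connection_string("<connection-string>")
    source = service.get_container_client("incoming-zips")
    target = service.get_container_client("extracted-files")

    # Download the archive into memory and re-upload each entry as its own blob.
    data = source.download_blob("upload.zip").readall()
    with zipfile.ZipFile(io.BytesIO(data)) as archive:
        for name in archive.namelist():
            if name.endswith("/"):
                continue  # skip directory entries
            target.upload_blob(name=name, data=archive.read(name), overwrite=True)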
