Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will leverage first-party solutions for customer feedback. Learn more here.

Storage

  1. more detailed error messages

    We've run into an issue where data we tried to store wouldn't fit in the blob file. The error message was (416) The page range specified is invalid.

    At first it wasn't clear whether we were trying to start saving beyond the end of the page file, whether the data didn't fit, or whether something else was the culprit. Eventually we figured it out.

    It would be nice if the error message provided more details: the page size, the size we tried to save, the location within the file we tried to write to, etc. That would have shown us the issue right away. Instead we had to spend a…
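    For page blobs specifically, a (416) usually means the requested range breaks one of the service's alignment rules: page blob ranges must start and end on 512-byte boundaries and stay within the blob's size. A minimal client-side pre-check (a sketch of the kind of detailed message asked for, not the service's actual validation code) might look like:

```python
PAGE_SIZE = 512  # page blobs operate on fixed 512-byte pages

def check_page_range(offset: int, length: int, blob_size: int) -> list:
    """Return the reasons a page-range write would be rejected, if any.

    Reports every problem at once, instead of a bare
    '(416) The page range specified is invalid.'
    """
    problems = []
    if offset % PAGE_SIZE != 0:
        problems.append(f"start offset {offset} is not a multiple of {PAGE_SIZE}")
    if length % PAGE_SIZE != 0:
        problems.append(f"length {length} is not a multiple of {PAGE_SIZE}")
    if offset + length > blob_size:
        problems.append(f"range end {offset + length} exceeds blob size {blob_size}")
    return problems
```

    An empty list means the range is aligned and in bounds; otherwise each entry names one violated rule.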

    12 votes  ·  2 comments  ·  Blobs

    Thank you for your feedback. We are currently working on providing this functionality and will provide updates when they become available. See the following article for the latest: https://docs.microsoft.com/en-us/rest/api/storageservices/status-and-error-codes2. Note that for REST API version 2017-07-29 and later, failed API operations also return the storage error code string in a response header. For any further questions, or to discuss your specific scenario, send us an email at azurestoragefeedback@microsoft.com.
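    Until then, a small helper can at least surface everything the service already returns: the x-ms-error-code response header (REST API version 2017-07-29 and later) plus the XML error body. A stdlib-only sketch (header and error-code values here are illustrative):

```python
import xml.etree.ElementTree as ET

def storage_error_details(headers: dict, body: str) -> dict:
    """Collect the error code and message from a failed Storage REST call.

    Prefers the x-ms-error-code response header and falls back to the
    XML error body: <Error><Code>..</Code><Message>..</Message></Error>.
    """
    details = {"code": headers.get("x-ms-error-code"), "message": None}
    if body:
        root = ET.fromstring(body)
        details["code"] = details["code"] or root.findtext("Code")
        details["message"] = root.findtext("Message")
    return details
```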

  2. Support for JSON format in API responses

    Enable JSON format support in blob storage API HTTP responses.

    For example, instead of:

    HTTP/1.1 404 The specified resource does not exist.
    Content-Length: 223
    Content-Type: application/xml
    Server: Blob Service Version 1.0 Microsoft-HTTPAPI/2.0
    x-ms-request-id: dbe27af2-701e-0045-0939-c25bc5000000
    Date: Tue, 24 Nov 2020 08:10:19 GMT

    <?xml version="1.0" encoding="utf-8"?><Error><Code>ResourceNotFound</Code><Message>The specified resource does not exist.
    RequestId:dbe27af2-701e-0045-0939-c25bc5000000
    Time:2020-11-24T08:10:20.8064839Z</Message></Error>

    use a query-string option or an "x-ms"-prefixed header to obtain:

    {
      "Error": {
        "Code": "ResourceNotFound",
        "Message": "The specified resource does not exist. RequestId:dbe27af2-701e-0045-0939-c25bc5000000 Time:2020-11-24T08:10:20.8064839Z"
      }
    }
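    Until the service offers this, a client-side shim can reshape the XML error body into exactly that JSON form. A sketch using only the standard library:

```python
import json
import xml.etree.ElementTree as ET

def error_xml_to_json(xml_body: str) -> str:
    """Re-shape a Storage XML error body into the JSON form proposed above.

    Each child element of <Error> (Code, Message, ...) becomes a key
    under the "Error" object.
    """
    root = ET.fromstring(xml_body)
    payload = {"Error": {child.tag: child.text for child in root}}
    return json.dumps(payload, indent=2)
```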

    12 votes  ·  0 comments  ·  Blobs
  3. 12 votes  ·  0 comments  ·  Blobs
  4. 11 votes  ·  0 comments  ·  Blobs
  5. Allow PaaS services to connect to firewall-protected storage as "Trusted Services"

    Currently only very few Azure PaaS services can connect to a firewall-protected Storage Account; practically all data integration tools (Data Factory, Data Lake Analytics, Databricks, Azure DW PolyBase, Logic Apps, Azure Functions, etc.) are not considered among these "Trusted Services".

    More and more of our Azure customers are complaining about this missing feature, as the alternative methods are clunky or unavailable, depending on the scenario.

    There are multiple ideas about the same issue, each for a specific service, but I see the problem as more foundational: it is not a question of which PaaS services should be included, more…

    11 votes  ·  1 comment  ·  Blobs
  6. SAS behavior for replacing/overwriting blobs

    It would be useful to define in the SAS token what should happen when creating a blob with an existing name. The current behavior replaces/overwrites the existing blob.
    Furthermore, the default behavior is something of a security threat, because an attacker holding a SAS could replace existing entries.

    11 votes  ·  2 comments  ·  Blobs
  7. Change error "specified blob or block content is invalid" to something more informative or fix underlying issue

    [Azure Government] I have two MSI files. I can upload chefdk-1.2.22-1-x86.msi, 324 MB, just fine via the portal. The other one, azure-powershell.3.6.0.0.msi, is 37 MB.

    When I upload via the portal I get a pop-up reading:

    Upload error for azure-powershell.3.6.0.0.msi 7:08 PM
    Upload block blob to blob store failed: Details. StatusCode=400, StatusText = The specified blob or block content is invalid

    When I attempt to upload with azure-cli, I just get 'Connection Aborted' after about 30 seconds.

    azure-cli also fails after I rename the file, and when I try to upload to a different container on this account.

    The portal…

    10 votes  ·  2 comments  ·  Blobs
  8. Support Smaller Reserved Capacity like 10TB

    I want smaller reserved capacity like 10TB.
    100TB is too big for me.

    10 votes  ·  1 comment  ·  Blobs
  9. CORS Rule

    Currently we can enable CORS for an exact domain (e.g. http://www.abc.com) or for all origins ("*"). Could you provide a feature to support all subdomains, e.g. http://*.abc.com, so that CORS is enabled for all subdomains of abc.com and restricted for all other domains?
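    The matching being asked of the service can be sketched in a few lines. This is only an illustration of the proposed rule semantics (exact origin, global wildcard, or one leading host wildcard), not how the service actually evaluates CORS rules:

```python
def origin_allowed(origin: str, pattern: str) -> bool:
    """Match an Origin header against a single CORS rule.

    Supports exact origins, the global wildcard '*', and a leading host
    wildcard such as 'http://*.abc.com' (any subdomain of abc.com, but
    not the bare apex and not other domains).
    """
    if pattern == "*" or origin == pattern:
        return True
    if "://*." in pattern:
        scheme, suffix = pattern.split("://*.", 1)
        host = origin.split("://", 1)[-1]
        return origin.startswith(scheme + "://") and host.endswith("." + suffix)
    return False
```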

    10 votes  ·  0 comments  ·  Blobs
  10. Add Page Blobs to the new Storage Reserved Capacity

    Based on the documentation, it seems the new Storage Reserved Capacity only supports block blobs. This excludes the ability to reserve storage for two of the most important storage scenarios, which use page blobs:

    • Virtual Machine data drives.

    • SQL Database files hosted in Azure Storage accounts.

    Please add support for reservation of page blob storage.

    9 votes  ·  0 comments  ·  Blobs
  11. Implement viewer for json / csv blob files

    At the moment the only way to view a file in blob storage is to download it and open it in an editor. That is fine for larger files, but for smaller ones it would be easier to get a quick (pre)view of the file in the browser just by clicking it. It should not be that hard to implement an additional pane with a view of the raw file contents.
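    The rendering side of such a pane is indeed small. A sketch, assuming the caller has already fetched the head of the blob (e.g. with an HTTP Range request) and knows its content type; a real implementation would also need to handle JSON truncated mid-document:

```python
import csv
import io
import json

PREVIEW_BYTES = 2048  # only the head of the blob is needed for a preview

def preview(text: str, content_type: str, max_rows: int = 5) -> str:
    """Render a short, readable preview of a JSON or CSV blob.

    JSON is pretty-printed; CSV is shown as up to max_rows pipe-separated
    rows; anything else falls back to a plain-text head.
    """
    if "json" in content_type:
        return json.dumps(json.loads(text), indent=2)
    if "csv" in content_type:
        rows = list(csv.reader(io.StringIO(text)))[:max_rows]
        return "\n".join(" | ".join(row) for row in rows)
    return text[:PREVIEW_BYTES]
```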

    9 votes  ·  0 comments  ·  Blobs
  12. create an API method for getting the number of BLOBs in a Container

    As far as I understand, there is currently no method for getting the total number of blobs in a single container.

    There is also a maximum number of blobs per listing request (5,000).

    We have containers with 10k-100k blobs, so we really need a way of quickly checking the number of blobs in a container.
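    Today the only option is to walk the paged listing and count, which is exactly the cost the idea wants to avoid. A sketch of that workaround, where `list_page` is a stand-in (hypothetical) for the List Blobs call returning a page of names plus a continuation marker:

```python
def count_blobs(list_page) -> int:
    """Count all blobs in a container by walking continuation markers.

    `list_page(marker)` must return `(names, next_marker)`, with
    next_marker None on the last page -- mirroring how the List Blobs
    REST operation pages through results.
    """
    total, marker = 0, None
    while True:
        names, marker = list_page(marker)
        total += len(names)
        if marker is None:
            return total

# A fake two-page listing, just to demonstrate the loop:
def fake_pager(marker):
    pages = {None: (["a", "b", "c"], "m1"), "m1": (["d", "e"], None)}
    return pages[marker]
```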

    9 votes  ·  0 comments  ·  Blobs

    Our apologies for not updating this ask earlier. Storage analytics at the container level is on our backlog, but is unlikely to be released in the coming year. Currently, account level analytics are available. In order to monitor, diagnose and troubleshoot an Azure Storage account, please see https://docs.microsoft.com/en-us/azure/storage/storage-monitoring-diagnosing-troubleshooting. In .Net you can now use the Analytics class to process these metrics. The API can be referenced at https://msdn.microsoft.com/en-us/library/azure/microsoft.windowsazure.storage.analytics.aspx. For any further questions, or to discuss your specific scenario, send us an email at azurestoragefeedback@microsoft.com.

  13. SAS At Folder level

    Hello Team,

    Could you please provide an option to generate SAS tokens at folder level within a blob container.

    E.g. I have a blob container called 'test' that contains the following files/blobs:

    FolderA/PathA/file.txt
    FolderA/PathA/file2.txt
    FolderA/PathA/file3.txt
    FolderB/PathB/file.txt

    We want to generate a SAS token such that the user can perform operations only inside 'FolderA' of the 'test' container.
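    Until folder-level scope exists, one workaround is a small token-issuing service of your own that only grants per-blob SAS tokens for paths under an allowed folder. The scope check it would apply is trivial (a sketch; the SAS issuance itself is omitted):

```python
def within_folder(blob_name: str, folder: str) -> bool:
    """True if blob_name lives under `folder` inside the container.

    The check a SAS-issuing proxy could apply before granting a
    per-blob token, emulating folder-level scope on the client side.
    Matches whole path segments, so 'FolderA' does not match 'FolderAX/...'.
    """
    prefix = folder.rstrip("/") + "/"
    return blob_name.startswith(prefix)
```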

    8 votes  ·  0 comments  ·  Blobs
  14. Specify access tier while copying files in Data Factory

    Could you make it possible to specify the access tier (hot, cool, archive) when copying files from one storage account to another in Azure Data Factory?

    https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers?tabs=azure-portal

    8 votes  ·  0 comments  ·  Blobs
  15. Blob Storage :: Enable filtering in listing APIs

    The Azure Blob Storage service is intended to be scalable, but its APIs do not support filtering of results. The service should enable filtering of container and blob listings at the service fabric layer.

    7 votes  ·  1 comment  ·  Blobs
  16. I want to set Cache-Control as an option in the Azure File Copy task

    I read this article about static website hosting using a blob storage account version 2 in Azure
    - https://azure.microsoft.com/da-dk/blog/azure-storage-static-web-hosting-public-preview/

    So I updated my Azure File Copy deployment task in Azure DevOps to target a newly created Storage Account (GPv2) that contains a $web folder: I switched the task to version 2.* (Preview) and entered $web as the Container Name.
    But we currently cannot set Cache-Control, so the cache cannot be cleared, and because I am using Azure CDN, the latest content is not displayed.
    I would like you to add this function as an option.

    7 votes  ·  0 comments  ·  Blobs
  17. Make it possible to set the time frame for running Azure life cycle management policies

    According to https://docs.microsoft.com/sv-se/azure/storage/blobs/storage-lifecycle-management-concepts, the block blob lifecycle policy runs only once every day. Can we set the time at which these policies run, so that we have better control over them? This would be useful in scenarios where, for example, we need to run archiving or clean-up jobs during hours when the business load is low.
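    For context, a lifecycle policy today expresses only *what* to do and the age threshold, not *when* during the day the rule runs. A sketch of a typical rule (names and values here are illustrative):

```json
{
  "rules": [
    {
      "name": "archive-old-logs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": ["blockBlob"],
          "prefixMatch": ["logs/"]
        },
        "actions": {
          "baseBlob": {
            "tierToArchive": { "daysAfterModificationGreaterThan": 90 }
          }
        }
      }
    }
  ]
}
```

    Note there is no field for a run window; the ask is effectively to add one.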

    7 votes  ·  1 comment  ·  Blobs
  18. Allow AAD App Identity and Role Assignments in Blob storage

    2019-03-04T04:47:40.513 [Error] Cannot create Shared Access Signature unless Account Key credentials are used.

    A system-assigned App ID allows me to access Key Vault when running in a deployed app instance. The app code simply creates a Key Vault client using the vault URL; no separate code is needed. Permissions are handled by Azure under the hood.

    Why can't I do this for the same app accessing Blob storage?

    7 votes  ·  0 comments  ·  Blobs
  19. DELETE https://myaccount.blob.core.windows.net/mycontainer?comp=list&prefix=foo

    The same way I can list blobs with a certain prefix in a blob container using the HTTP method GET, I would like to issue an HTTP DELETE against the same combination of container + prefix and effectively delete all blobs matching the prefix in that container!

    6 votes  ·  1 comment  ·  Blobs

    Thank you for your feedback. We think of this ask as “range delete”, and plan to deliver it after “batch delete”. Batch delete will allow you to enumerate multiple blobs to be deleted in a single HTTP request. Batch delete is planned for the coming year, but there is not an ETA we can share for range delete. We will provide updates when they become available. For any further questions, or to discuss your specific scenario, send us an email at azurestoragefeedback@microsoft.com.
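    In the meantime, range delete has to be emulated client-side: list with the prefix, then delete each match, one HTTP request per blob. A sketch where a plain dict of name sets stands in for the real List Blobs / Delete Blob calls:

```python
def delete_by_prefix(storage: dict, container: str, prefix: str) -> int:
    """Emulate 'range delete' client-side: list matches, then delete each.

    `storage` maps container name -> set of blob names, standing in for
    the real listing (with `prefix=`) and per-blob delete calls. Batch
    or range delete would collapse the per-blob round trips into one.
    Returns the number of blobs deleted.
    """
    doomed = [name for name in storage[container] if name.startswith(prefix)]
    for name in doomed:
        storage[container].remove(name)
    return len(doomed)
```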

  20. Gzip encoding for 404 document in static website hosting

    I am hosting a Vue SPA through the static website hosting feature offered by Azure Blob storage. My files are gzip encoded, and I add the respective header value when uploading. The site works fine!
    However, when I try to manually hit a URL (say, xyz.com/fgt) or an invalid URL (say, xyz.com/errorpage), it serves me gzip gibberish instead of an HTML file. This is the behavior you see when "Content-Encoding: gzip" is not set on your HTML file, but I do set it during my deployment; that's how I am able to see my site in the first place. However, if…

    6 votes  ·  1 comment  ·  Blobs
