How can we improve Azure Storage?

Add a way to copy the contents of an entire storage account to another storage account

Say your application stores important data in an Azure Storage Account, perhaps spread across both Table and Blob containers.

It would be very handy to be able to copy the entire contents of a storage account to another storage account, so that you could easily duplicate your Azure Storage environment between your staging and production environments. This would allow you to test updates to your application on a copy of your live production data before actually deploying to production.

I know that bits and pieces of this functionality already exist, via things like Copy Blob and Blob Snapshots, but nothing built into Azure exists for copying table storage.
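Pending a built-in feature, the table-storage gap can be scripted by hand. A minimal, hypothetical sketch: `copy_table` is not an Azure API, and the client objects are only assumed to follow the azure-data-tables `TableClient` surface (`list_entities` / `upsert_entity` are real methods of that SDK):

```python
def copy_table(src_table, dst_table):
    """Copy every entity from src_table into dst_table; returns the count.

    Both arguments are assumed to expose the azure-data-tables
    TableClient surface: list_entities() and upsert_entity(entity).
    upsert_entity is idempotent per (PartitionKey, RowKey), so re-running
    after a failure resumes safely.
    """
    copied = 0
    for entity in src_table.list_entities():
        dst_table.upsert_entity(entity)
        copied += 1
    return copied
```

Looping this over every table in the source account (and every container for blobs) approximates the requested whole-account copy, though without any point-in-time consistency guarantee.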

163 votes
Ray shared this idea  ·  Admin →

Thank you for your feedback. We would like to get more feedback about this item before we prioritize. Is the ask here to replicate a storage account on demand? Or to have a mechanism that always keeps two or more storage accounts in sync?

Meanwhile, to unblock some of our users, we have published a sample that copies blobs from one account to another:


  • Bassem Mohsen commented

    What we need is some way to take "snapshots" of the storage account at certain points in time, either automatically or on demand.

    Syncing is not what we are looking for, because with syncing any mistake will propagate, which defeats the purpose of having a backup.

  • Andrew commented

    Need this for creating scheduled copies of a production storage account for staging purposes.

  • Reed Robison commented

    This would be really handy for both backup and testing purposes. Suppose I want to test a new release against a copy of my production storage data; I'd like to just snapshot the entire thing into a test account without having to copy blob and table data manually into a new account. Being able to simply clone a storage account would also be really handy for backup purposes if I wanted to copy off a production account at a given time for quick restoration later. I know we can do this manually.

  • Bassem Mohsen commented

    The PowerShell script suggested above by the Microsoft Azure Storage Team did not work for me because I was trying to copy a large number of blobs (about 300,000). The script was very slow and I got many timeout errors.
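Timeouts at that scale are often mitigated by issuing the per-blob copies concurrently and retrying only the failures. A hypothetical sketch, not taken from the Microsoft sample: `copy_one` is a placeholder for whatever issues the actual copy (for example, a server-side `start_copy_from_url` call):

```python
from concurrent.futures import ThreadPoolExecutor

def copy_blobs_parallel(blob_names, copy_one, max_workers=32):
    """Fan copy_one(name) out over a thread pool; return names that failed.

    Collecting failures instead of aborting means a 300,000-blob run can
    be resumed by retrying just the returned names.
    """
    failures = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(copy_one, name): name for name in blob_names}
        for future in futures:
            try:
                future.result()
            except Exception:
                failures.append(futures[future])
    return failures
```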

  • Randy Coy commented

    We need an on-demand solution for occasional testing scenarios. In our experience, AzCopy via PowerShell scripts is very slow (and frequently fails) on storage accounts with a significant number of containers.

  • Anonymous commented

    We have a disaster-recovery scenario where we need to immediately have read-write access in the secondary region. With an existing RA-GRS account the secondary-region copy is read-only, which creates a dependency on Microsoft to perform the failover. To control the failover ourselves, a copy of the entire contents to another storage account would help, so this feature is important to us.

  • Maximilian Wurm commented

    Use cases I would need it for:
    - Move data from old storage accounts to the new deployment model based on Resource Explorer.
    - Copy data into the storage of an Azure Stack.

  • Ben Prejean commented

    For our use case, we are interested in cloning an existing storage account as part of a CI environment build-out / refresh.

  • Chris Cox commented

    AzCopy takes so long to back up a storage account that it is useless to me. I need a way to back up an entire storage account more quickly, primarily for isolated backups. If a developer were to accidentally delete a blob container, I want a quick way to restore it from a backup copy.

  • Andrew commented

    We would love this for our release process. Making changes to table storage is a one-way process with no way to revert if you make a mistake.
    Also, being able to back up one's table and blob storage is a must for us. Using AzCopy works, but it takes ages and is clunky at best. It also relies on someone doing it right. Having a button in the portal would be great.

  • Alex Bulahov commented

    We need to copy storage completely on demand. There is a use case where we want to try our release management, or run experiments, against a copy of the real production storage.

  • Jan Vilimek commented

    The possibility to "clone" Azure storage (the same as Copy Database for SQL Azure) is a must for our CI environments. We need the ability to copy the entire storage easily and, most importantly, quickly for each CI environment (several times per month for about 20-30 CI environments).

  • Matt Allison commented

    Piling on to what I said before. AzCopy is cute but ultimately isn't working for me. I could use any or all of the following:
    1) The ability to copy a locked VHD
    2) The ability to create a snapshot (or a series of rolling snapshots - e.g. /max:3, which would delete older snapshots when taking a new one)
    3) The ability to copy the last snapshot (e.g. /lastsnap) to a usable VHD
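The rolling-snapshot idea in (2) reduces to a retention rule. A small hypothetical helper - the `/max:3` flag is the commenter's suggestion, not an existing AzCopy option - that decides which snapshots to prune given their timestamps:

```python
def snapshots_to_delete(snapshot_times, keep=3):
    """Return the snapshot timestamps to delete so that only the `keep`
    most recent remain (the suggested /max:N behavior).

    snapshot_times are sortable timestamps (e.g. ISO-8601 strings, as
    Azure blob snapshots use); keep=0 deletes everything.
    """
    ordered = sorted(snapshot_times)
    return ordered[:-keep] if keep > 0 else ordered
```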

  • Matt Allison commented

    If I could vote this up a million times I would. I cannot possibly overstate how important it is to be able to clone/backup/copy an entire Storage Account (up to and including every Storage Account within a Subscription) to an alternate location (different account/subscription/region).

    Without that, every Azure subscriber is just one step away from being the next CodeSpaces.

  • Dervanil Junior commented

    I think this is a must-have, because I'll definitely use it to transfer my cloud service to the recently opened data center in Brazil without any chance of data loss.
