Storage
-
Why create a Storage account for each service when I already have one?
If a Storage account already exists, containers alone should be enough to map to any service we create.
Also, when we select a Storage account, it should show the SQL, VM, or any other resources that are using that storage.
1 vote -
Azure File Sync agent to follow NTFS symbolic links
Allow the Azure File Sync agent to follow NTFS symbolic links. These are transparent to the OS, so it seems deliberate effort was made to disable this; why?
I had an amazing idea for how to apply this, but the agent ignoring symbolic links has made it impossible.
I'm confused when considering any possible reason for this :(
1 vote -
see how many files are synced, and which ones are being automatically excluded
To increase my confidence that all of my files are being synced, it would be nice to see a count of how many files are in the synced storage account in the Azure portal, so I can compare it to the number of files in the on-premises folder. It would also be nice to see a list of the files that were automatically excluded (such as desktop.ini, ~., *.tmp, thumbs.db), so if the counts differ, I can check whether that is the reason.
1 vote -
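Until the portal exposes such a count, the comparison the post asks for can be sketched client-side. The snippet below counts local files while skipping exclusion patterns; the pattern list is an assumption taken from the post above, not the service's authoritative exclusion list.

```python
import fnmatch
import os

# Assumed exclusion patterns, based on the examples in the post
# (desktop.ini, ~-prefixed temp files, *.tmp, thumbs.db).
EXCLUDED_PATTERNS = ["desktop.ini", "~*", "*.tmp", "thumbs.db"]

def is_excluded(name: str) -> bool:
    """Return True if the file name matches any exclusion pattern."""
    return any(fnmatch.fnmatch(name.lower(), pat) for pat in EXCLUDED_PATTERNS)

def count_files(root: str) -> tuple[int, list[str]]:
    """Walk a local folder and return (syncable file count, excluded names)."""
    synced, excluded = 0, []
    for _dirpath, _dirs, files in os.walk(root):
        for name in files:
            if is_excluded(name):
                excluded.append(name)
            else:
                synced += 1
    return synced, excluded
```

The syncable count can then be compared against the blob count reported for the storage account's share.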
Azure Storage Explorer handling of multiple tenants and subscriptions
Storage Explorer does not separate or segregate subscriptions, storage accounts, or containers in a useful way, which leads to a slow and cumbersome experience. Functionality should be added so that subscriptions, storage accounts, and the containers in those accounts are listed separately and segregated, making it straightforward to browse to the intended data.
1 vote -
More options in the rule filter of Lifecycle Management
The rule filter of Lifecycle Management should have more options to fit various scenarios. In my case, I need to delete all files in a blob folder except one file, but I found there is no "exclude" option for creating such a rule. It would save me tremendous work if such a rule could be created. Thanks!
1 vote -
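For context, a lifecycle management policy rule can currently only *include* blobs via `prefixMatch`; there is no exclude filter, which is the gap described above. A sketch of the present rule shape (names and paths are placeholders):

```json
{
  "rules": [
    {
      "name": "delete-old-logs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": ["blockBlob"],
          "prefixMatch": ["mycontainer/logs/"]
        },
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 30 }
          }
        }
      }
    }
  ]
}
```

Until an exclude option exists, one workaround is to structure blob prefixes so that the file to keep lives outside the matched prefix.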
automatically save back to the file share
Since we cannot map to file shares because Comcast blocks port 445, it would be nice if files opened through Storage Explorer automatically saved back to the file share instead of the local drive. It is not really a file "share" if the changes are only stored locally and the user then has to take another step to upload them, which they will most likely never do.
1 vote -
link broken
Broken links on https://docs.microsoft.com/en-us/azure/storage/common/storage-samples-java#feedback:
"Client-Side Encryption": https://github.com/Azure-Samples/storage-java-client-side-encryption
"Get Page Ranges": https://github.com/Azure/azure-storage-java/blob/master/microsoft-azure-storage-test/src/com/microsoft/azure/storage/blob/CloudPageBlobTests.java
"SAS": https://github.com/Azure/azure-storage-java/blob/master/microsoft-azure-storage-test/src/com/microsoft/azure/storage/blob/SasTests.java
"Add Message": https://github.com/Azure/azure-storage-java/blob/master/microsoft-azure-storage-samples/src/com/microsoft/azure/storage/queue/gettingstarted/QueueBasics.java
1 vote -
Azure File Sync missing file on other site if file is being use
We have set up and are testing Azure File Sync for two sites (Site A and Site B), and everything works and syncs fine, except that whenever a file (.dwg) is in use and being modeled on Site B, the same file goes missing on Site A (and vice versa). The only way for the file(s) to reappear is to close the file and wait a few minutes, after which it shows up again on the other site. Has this happened to other people as well?
1 vote -
Import/Export Service: add ability to prepare already copied data on the drive for importing to Azure
Import/Export Service: add ability to prepare already copied data to the drive for importing to Azure.
Our application has its own data storage, and we do not copy data from disk to disk.
We read data from our storage and want to write it as a stream directly to the device that will be shipped to Azure. To use the WAImportExport tool, we must first write the data to a local drive and then copy it through the tool. There is a lot of data, so we would need additional storage to copy it locally. This does not suit us; we want to avoid it.
And of course…
1 vote -
Support hierarchy of directories for export job
For example, if I have a blob named '\root\directory\file1.txt',
I want the drive shipped to me by the export job to contain a directory "root", a subdirectory "directory", and inside it the file "file1.txt".
1 vote -
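The mapping the export-hierarchy request above describes can be sketched as a simple normalization of blob-name separators into path components (a hypothetical helper, not part of any Azure tool):

```python
def blob_to_local_path(blob_name: str) -> str:
    """Normalize a blob name (with '\\' or '/' separators) into a
    relative directory path, dropping empty leading segments."""
    parts = [p for p in blob_name.replace("\\", "/").split("/") if p]
    return "/".join(parts)
```

Applied to the example above, `'\root\directory\file1.txt'` would materialize as the nested path `root/directory/file1.txt` on the shipped drive.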
Passwords for Hadoop creation are rough/silly?
10 characters, lower and upper case, a number, a special character? Even my onePass fully-hashed passwords don't meet these criteria. If there is a security policy that requires passwords to be that strong, then can you please show all error warnings at the same time, so that I don't have to try three different passwords (because I didn't know it needed to be 10 characters long, etc.)?
1 vote -
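The fix the post asks for is to validate all rules in one pass and report every failure together. A minimal sketch of that pattern, assuming the policy as stated in the post:

```python
import re

# Assumed policy rules, taken from the post: length >= 10, lowercase,
# uppercase, a digit, and a special character.
RULES = [
    (r".{10,}",        "must be at least 10 characters long"),
    (r"[a-z]",         "must contain a lowercase letter"),
    (r"[A-Z]",         "must contain an uppercase letter"),
    (r"\d",            "must contain a number"),
    (r"[^A-Za-z0-9]",  "must contain a special character"),
]

def password_errors(password: str) -> list[str]:
    """Check every rule and return all violations at once,
    instead of surfacing them one retry at a time."""
    return [msg for pattern, msg in RULES if not re.search(pattern, password)]
```

Showing the full list of `password_errors` to the user means one failed attempt is enough to learn every missing requirement.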
Azurefilesync server High-availability and backup in case Server goes down
For high availability of the Azure File Sync agent on-premises, can it work in cluster mode? If so, how would that be set up? Also, how should the agent server be backed up, and what is the impact of virus scanning on the volume that is synced with the Azure file share in the cloud?
1 vote -
Lifecycle management to include rules for Access Tier Last Modified
If rehydrating from Archive, it would be great for the lifecycle rules to recognize this metadata and not send the blob straight back to Archive while the rules are still enabled.
It is annoying to have to disable the rules, coordinate with whoever needs the data so they can download it, and then re-enable the rule.
This would allow "Move to Archive if Last Modified is older than 30 days AND Tier Last Modified is older than 30 days".
1 vote -
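If such a condition were added, the requested rule might take a shape like the following. This is a hypothetical illustration: the condition name `daysAfterLastTierChangeGreaterThan` is a placeholder for the "Tier Last Modified" check the post asks for, not a confirmed policy field at the time of the request.

```json
{
  "rules": [
    {
      "name": "archive-stable-blobs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": ["blockBlob"] },
        "actions": {
          "baseBlob": {
            "tierToArchive": {
              "daysAfterModificationGreaterThan": 30,
              "daysAfterLastTierChangeGreaterThan": 30
            }
          }
        }
      }
    }
  ]
}
```

With both conditions ANDed, a freshly rehydrated blob would not be re-archived until it had stayed in its current tier for the configured period.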
https://azure.microsoft.com/en-us/free/free-account-faq/ have one sentence that's not right. can you fix to avoid misunderstanding for new
https://azure.microsoft.com/en-us/free/free-account-faq/
This link contains an error in one sentence: "128 GB of Managed Disks as a combination of two 64 GB (P6) SSD storage, plus 1 GB snapshot and 2 million I/O operations".
Please add the word "Standard" or "Premium" before "SSD" in the above sentence to avoid customer misunderstanding.
SR # 119030726003439
1 vote -
Include separate Creation Time column for blobs in Storage Explorer. It only contains Last Modified Time
Include a separate Creation Time column for blobs in Storage Explorer; it currently shows only Last Modified Time.
1 vote -
Deleting Azure database now prevents you from recreating it
In the last month something changed on the Azure portal. Now when I delete a database, I can't recreate it for many minutes because creation fails, saying that there is already a database with that name. This is new, undesirable behaviour. All last year I was able to delete my test database and then immediately recreate it from a backup of production. Now I have to wait many minutes. What changed? Why?
This is not a good change, as it kills productivity.
1 vote -
Ensure help texts use the same vocabulary as the controls they're hinting
If I have the portal configured to use Spanish and I create a new blob storage account, one of the options is "Nivel de acceso" with a choice between "Estático" and "Dinámico". I guessed that "static" means "cold" and "dynamic" means "hot", but I wanted to confirm so I looked at the hoverover help text.
"Al elegir un nivel de acceso, puede especificar el padrón de acceso para los datos de la cuenta de Blob Storage. El nivel de acceso activo es ideal para los datos a los que se accede frecuentemente. El nivel de acceso inactivo está optimizado para…" (roughly: "When choosing an access tier, you can specify the access pattern for the data in the Blob Storage account. The 'activo' [hot] access tier is ideal for frequently accessed data. The 'inactivo' [cool] access tier is optimized for…" — note the help text uses "activo"/"inactivo" while the control uses "Estático"/"Dinámico".)
1 vote -
how to change blob access tier in .net?
I just want to know if it's possible to change the access tier of a blob using the .NET client library.
If I'm trying to do something it isn't intended for, let me know, but from everything I have read there is a lot of talk about what access tiers are used for and not how exactly it's done in code.
1 vote
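Yes, the client libraries expose this: the .NET `Azure.Storage.Blobs` `BlobClient` has a `SetAccessTier` method, and the Python `azure-storage-blob` v12 equivalent is `BlobClient.set_standard_blob_tier`. A hedged sketch in Python (account and blob names are placeholders; the helper is duck-typed so the tier-name check is testable without a live account):

```python
# Real usage with the azure-storage-blob v12 SDK would look like:
#
#   from azure.storage.blob import BlobClient
#   blob = BlobClient.from_connection_string(
#       conn_str="<connection-string>",
#       container_name="mycontainer",
#       blob_name="myblob.txt",
#   )
#   change_tier(blob, "Archive")

VALID_TIERS = ("Hot", "Cool", "Archive")

def change_tier(blob_client, tier: str) -> None:
    """Validate the tier name, then ask the client to change the
    blob's access tier. blob_client is any object exposing
    set_standard_blob_tier(tier), as the v12 Python BlobClient does."""
    if tier not in VALID_TIERS:
        raise ValueError(f"unknown tier: {tier}")
    blob_client.set_standard_blob_tier(tier)
```

Note that moving a blob out of Archive triggers rehydration, which can take hours before the blob is readable again.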