The shrinkdatabase command should be supported whether TDE is on or off, because extra cost is charged when the unallocated space is huge. 67 votes
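For context on the cost concern, a sketch of the SQL Server commands this request asks to support (the database name `MyDW` is hypothetical):

```sql
-- Inspect allocated vs. used space to see how much is unallocated
select name,
       size * 8 / 1024 as allocated_mb,
       fileproperty(name, 'SpaceUsed') * 8 / 1024 as used_mb
from sys.database_files;

-- SQL Server syntax the request asks to support regardless of TDE state:
-- shrink the database, leaving 10 percent free space
dbcc shrinkdatabase (N'MyDW', 10);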
SQL Data Warehouse scale stepping should be much smaller (10 DWU or even 1 DWU) than the current 100 DWU!
Please fix that ASAP!
Thank you! 66 votes
Thank you for all the feedback folks. We are currently evaluating options for lower click stops for SQL Data Warehouse. Please continue to vote and comment on your scenario below. Thank you for your patience.
Resource Governance - Resource Pools - Control CPU, physical IO, memory, priority, run-time cap, maximum requests, concurrency, and request timeout.
1. Ability to manage workloads effectively
2. Ability to specify limits on the amount of CPU, physical IO, and memory
3. User-Defined Resource Pools
a. Memory size
b. Memory cap
c. Maximum requests
d. Grant time-out
e. Run-time cap 62 votes
We are in the very early stages of planning this improvement.
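For reference, the boxed-product Resource Governor already exposes the knobs listed above; a sketch of the equivalent SQL Server syntax (pool and group names are hypothetical) looks like:

```sql
-- Resource pool: CPU, memory, and physical IO limits
create resource pool ReportingPool
with (
    min_cpu_percent = 0,      max_cpu_percent = 50,
    min_memory_percent = 0,   max_memory_percent = 40,
    min_iops_per_volume = 0,  max_iops_per_volume = 500
);

-- Workload group: priority, grant time-out, run-time cap, max requests
create workload group ReportingGroup
with (
    importance = low,
    request_max_memory_grant_percent = 25,
    request_memory_grant_timeout_sec = 60,   -- grant time-out
    request_max_cpu_time_sec = 300,          -- run-time cap
    group_max_requests = 10                  -- maximum requests
)
using ReportingPool;

alter resource governor reconfigure;
```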
Currently SQL DW doesn't support the longer-term backup retention recently released for SQL DB. It would be great to have this capability for compliance and auditing requirements. 59 votes
Thank you for your patience. This is currently on our road map. Please continue voting for this feature and comment on your scenario below. We’ll reach out when the feature is available to everyone who voted.
You can see what the database is scaled to (e.g. DWU 200), but how do you know how much is actually being used over time? The portal displays a graph of both the DWU limit and the DWU used, but there is no way to programmatically monitor how much is being used. 27 votes
We are actively improving our monitoring experience. Currently we have ‘DWU Used’ in the portal which is a blend between CPU and IO to indicate data warehouse utilization. We also have future improvements on our road map such as Query Data Store and integrating with Azure Monitor for near real time troubleshooting in the Azure portal. If anyone has any other feedback, please elaborate on your scenario on this thread. Thank you for your continued support!
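Until richer metrics land, one rough programmatic proxy (an assumption, not an official utilization metric) is to count running and queued requests from the PDW request DMV:

```sql
-- Running vs. suspended (queued) requests as a rough utilization signal
select status, count(*) as request_count
from sys.dm_pdw_exec_requests
where status in ('Running', 'Suspended')
group by status;
```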
It would be great if table statistics were automatically created and updated in Azure Data Warehouse.26 votes
We are working on this now. It should be out in the next few months!
It would be great if we had a SQL command to pause/resume the DB instance, like we do for scale up/down (under the master DB). This would let us manage it from ADF without having to trigger the REST API or PowerShell. 22 votes
Thank you for voting for this feature. We understand this scenario and will consider this in a future release. For now, please continue monitoring this item and have your team vote for this feature. Thank you for your patience.
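In the meantime, pause/resume has to go through PowerShell rather than T-SQL; a minimal sketch with the Az cmdlets (resource group, server, and database names are hypothetical) is:

```powershell
# Pause the data warehouse (the operation this request wants exposed as a SQL command)
Suspend-AzSqlDatabase -ResourceGroupName "MyRg" -ServerName "myserver" -DatabaseName "MyDW"

# Resume it later
Resume-AzSqlDatabase -ResourceGroupName "MyRg" -ServerName "myserver" -DatabaseName "MyDW"
```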
The charts in the SQL DW blade in the portal and the ability to add alerts are very helpful. Please add additional metrics. The two metrics that I think would be helpful are:
Number of Queued Queries (once you've exhausted your 32 concurrent queries or your concurrency slots, or a queued query needs more concurrency slots than are available, queries get queued)
Number of Concurrency Slots Available (the sum of concurrency slots used by currently running queries... this would help surface whether you need to scale to more DWUs) 22 votes
Thank you for voting for this folks! This is on our radar. We have plans to improve workload management for Azure SQL Data Warehouse which includes monitoring capabilities within the Azure portal. We’d love to hear your feedback so please comment on your scenario below.
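While the portal metrics are pending, both numbers can be approximated from DMVs today (the status filter is an assumption about how queued requests surface in the request DMV):

```sql
-- Queries currently waiting in the queue
select count(*) as queued_queries
from sys.dm_pdw_exec_requests
where status = 'Suspended';

-- Waits by resource type; concurrency-slot waits show up here
select type, count(*) as waiting_requests
from sys.dm_pdw_resource_waits
group by type;
```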
A customer asked if there are plans to implement some sort of elastic pools where you can include many data warehouses and share DWU (eDWU?) the same way elastic database pools work with SQL Databases.19 votes
Thank you for all the feedback folks. This is on our radar and we will consider this in a future release. We will reach back out when it is on the roadmap and can share when we have an update. Please vote for this feature and comment on your scenario below.
Increase the concurrency limit from 32 to unlimited and keep it tweakable, so customers can vary it according to their needs. 14 votes
Concurrency was increased to 128. Unlimited concurrency requires further analysis.
At the moment the automated backup process takes place every 8 hours unless the database is paused. Upon resumption of the database, it appears that it has to be online for 8 hours before the next backup is taken.
If the system is repeatedly resumed and then paused, it is possible to go for a long duration without backups. Could you please evaluate taking an automatic backup either when the database is paused or resumed? 9 votes
We are actively exploring ways to enable event-driven backups. Currently this scenario can be addressed by User Defined Restore Points which is on our road map for this calendar year. Stay tuned and thank you for your patience.
I should not have to start Azure SQL Data Warehouse, wait for it to resume, then issue a scale operation and wait again. 8 votes
Thank you for your feedback. We understand this scenario and will consider this in a future release. In the meantime, please have your team vote for this feature and stay tuned for an update on this item. Thank you for your patience.
When using Polybase to load into Data Warehouse via Data Factory, the CONTROL permission on the database is required for the user.
Can this be limited to a schema owner, or be made more granular at the database level? 7 votes
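For context, the database-scoped grant required today versus the narrower schema-scoped grant the request is asking to be sufficient (user and schema names are hypothetical):

```sql
-- Required today for Polybase loads via Data Factory: database-wide CONTROL
grant control on database::MyDW to LoaderUser;

-- The ask: allow something narrower to suffice, e.g. schema-scoped rights
-- (illustrative only; not currently enough for Polybase loads)
grant alter, insert, select on schema::staging to LoaderUser;
```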
Create a simple columnstore table and look at sys.stats:
create table dbo.systemDefinedStatBug
with (clustered columnstore index, distribution = round_robin)
as
select 1 as column1, 2 as column2;

select *
from sys.stats s
where s.object_id = object_id('dbo.systemDefinedStatBug');
Notice it has a system-generated "statistic". It used to be that the row was marked user_created = 0, so we could easily tell it wasn't a regular user-created stat. But these days in SQL DW it says user_created = 1 on this "stat". Please fix this bug. 5 votes
The LABEL column in the sys.dm_pdw_exec_requests DMV is limited to 255 characters.
This field needs to be much larger to store rich metadata/context for some queries. 4 votes
Thank you for all the feedback folks. Unfortunately this is taking longer than we’d like. We will reach back out when it is on the roadmap and can share when we have an update.
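For reference, labels are attached with the LABEL query option and then matched in the DMV, which is where the 255-character limit bites (the table name `dbo.FactSales` is hypothetical):

```sql
-- Tag a query with contextual metadata
select count(*) from dbo.FactSales
option (label = 'etl:nightly-load:step-3');

-- Find it later by label; anything past 255 characters would be truncated today
select request_id, [label], status
from sys.dm_pdw_exec_requests
where [label] = 'etl:nightly-load:step-3';
```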
sp_send_dbmail needs to be supported so mails can be sent from stored procedures created on Azure.
We should have the capabilities of sp_send_dbmail, which is available in SQL Server databases. 3 votes
Thank you for all the feedback folks. Please comment on your scenario below. You can create Azure alerts for metrics and logs along with Azure functions to send emails.
Persist the data dictionary to a separate database by allowing the customer to persist the DMVs, which can be used by DBAs for fixing performance issues. 3 votes
Thank you for voting for this feature folks. We’d like additional information before commenting on this topic. Please provide your scenario in the comments below. We currently have Query Data Store on our roadmap.
It does not seem possible to change the default language to anything other than English US. I would like to be able to change the language to British. I have tried at the user and database level and neither seems to be supported.
EXEC sp_configure 'default language', 23;
ALTER LOGIN [...] WITH DEFAULT_LANGUAGE = [British]; 3 votes
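A session-scoped workaround, assuming SET LANGUAGE is honored by the service (it must be issued on every connection, so it is not a substitute for a default):

```sql
-- Per-session alternative to a default-language setting
set language British;
select @@language;
```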
Implement the sys.dm_db_stats_properties DMV to expose the modification counter (mod_ctr) and get a better idea of when stats should be updated. Using STATS_DATE isn't a complete solution, as you have no idea whether any rows have changed since that date. 2 votes
Thank you for all the feedback folks. We are continuously improving the manageability experience with SQL Data Warehouse which includes automatic statistics. We will reach back out when this is on the roadmap and can share when we have an update.
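The gap described above: STATS_DATE gives only the last update time, while sys.dm_db_stats_properties (available in SQL Server and SQL DB) adds modification_counter. A sketch, using a hypothetical table `dbo.FactSales`:

```sql
-- Available today: when each statistic was last updated
select s.name, stats_date(s.object_id, s.stats_id) as last_updated
from sys.stats s
where s.object_id = object_id('dbo.FactSales');

-- Requested: SQL Server / SQL DB also surface rows modified since that update
select sp.stats_id, sp.last_updated, sp.rows, sp.modification_counter
from sys.stats s
cross apply sys.dm_db_stats_properties(s.object_id, s.stats_id) sp
where s.object_id = object_id('dbo.FactSales');
```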
Allow installation of the Linux ODBC driver to be automated via Puppet. The current Microsoft SQL Server ODBC packages can't be installed via Puppet because they require EULA acknowledgment. If we adopted the ADW technology, we would inevitably have to install the ODBC software on several grids of computers that host our various ETL software. Puppet installation allows the implementers of new systems to get the software installed automatically without the DBA team's intervention. 2 votes
Thank you for the feedback. Please follow up on this request with our tooling and drivers team as well. We will reach out when this is on our road map.
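For comparison, the manual install already supports unattended EULA acceptance via an environment variable, which is what a Puppet package resource would need to pass through (distro and package-version specifics are assumptions):

```shell
# Debian/Ubuntu: accept the EULA non-interactively so no prompt blocks automation
sudo ACCEPT_EULA=Y apt-get install -y msodbcsql17

# RHEL/CentOS equivalent
sudo ACCEPT_EULA=Y yum install -y msodbcsql17
```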