Thanks for reporting this. We are looking into it. 10703354
Thank you for voting for this feature! We are aware of this scenario and are looking into ways of supporting this and improving our query troubleshooting experience. In the meantime, stay tuned for an update and please continue voting for this feature.
Working on it now. This should be out in the next few months!
Sorry for the confusion, there was a regression during deployment in the Large Object support that caused a rollback of the functionality. We are actively working on getting a fix.
I did some research on this and totally agree. It seems like a very small enhancement to SQL DW would be all that's needed.
If SQL DW could add support for the following GRANT statement, then ADF could use PolyBase with reasonable permissions. Please support:
GRANT ALTER ANY DATABASE SCOPED CREDENTIAL to adf_user
Once that is done, I think the following grants (which already work today) will give ADF enough permissions to use PolyBase:
--if some admin has previously run CREATE MASTER KEY then all ADF needs is to select from sys.symmetric_keys where symmetric_key_id = 101 to confirm that's done already
GRANT VIEW DEFINITION TO adf_user
--allows ADF to create a new external data source
GRANT ALTER ANY EXTERNAL DATA SOURCE TO adf_user
--allows ADF to create a new external file format
GRANT ALTER ANY EXTERNAL FILE FORMAT TO adf_user
--gives ADF permissions needed to the dbo schema where ADF currently puts external tables
GRANT ALTER ON SCHEMA::dbo TO adf_user
--gives ADF permissions to create an external table
GRANT CREATE TABLE TO adf_user
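To confirm the master key prerequisite mentioned in the first comment above, an admin could run a check along these lines. This is only a sketch built around the sys.symmetric_keys query described in that comment; whether a password clause is required for CREATE MASTER KEY depends on the service version, so treat that part as an assumption:

```sql
-- Sketch: confirm a database master key exists before relying on
-- database scoped credentials (query pattern from the comment above).
IF NOT EXISTS (SELECT 1 FROM sys.symmetric_keys WHERE symmetric_key_id = 101)
BEGIN
    -- Run once by an admin. Depending on version, an
    -- ENCRYPTION BY PASSWORD clause may be required here.
    CREATE MASTER KEY;
END;
```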
Thanks for the feedback. To help us further evaluate the ask, please comment with more detail on the scenario in which you want to extract data out of SQL DW.
One scenario where we could use this is creating a second DW and copying a subset of tables from DW1 to DW2 via the ADF Copy Wizard (which would export to blob and then import from blob). This can be common with large enterprises that have hub-and-spoke DWs.
There are dozens of reasons you may want to export from SQL DW to blob. Please optimize it so it runs fast using Polybase.
Thank you for voting for this feature. We understand this scenario and will consider this in a future release. For now, please continue monitoring this item and have your team vote for this feature. Thank you for your patience.
Thanks for this feedback. This is something that has been in our backlog, and having your input helps us to prioritize this.
As a workaround, you can currently write your own runbook to check in other artifacts. You may want to take a look at http://blogs.technet.com/b/privatecloud/archive/2014/05/08/automation-mvp-spotlight-series-tfs-and-service-management-automation-better-together.aspx to see how this was done with TFS, and adapt it for modules.
Thanks for the feedback.
Looking into it. Thanks.
Planning to review this in more detail when we get to scale-out soon.
Currently focusing on the remaining features to get us to GA with tabular. We will confirm timelines/availability for MD as soon as they are available.
We are actively improving our monitoring experience. Currently we have ‘DWU Used’ in the portal which is a blend between CPU and IO to indicate data warehouse utilization. We also have future improvements on our road map such as Query Data Store and integrating with Azure Monitor for near real time troubleshooting in the Azure portal. If anyone has any other feedback, please elaborate on your scenario on this thread. Thank you for your continued support!
Thank you for voting for this feature! We are aware of this scenario and are looking into ways of supporting this. In the meantime, stay tuned for an update and please continue voting for this feature.
There was a recent announcement of the Azure SQL Database Long-Term Retention feature. Maybe adding SQL DW support to that feature would accomplish this request?
The current approach to long-term backup (restoring as a new DW that stays paused) was a good idea until the switch to premium storage. Now it's too expensive.
Thanks for the suggestion. We are looking into this scenario for a future release. 7557959
I should have posted the error message I get:
Msg 107090, Level 16, State 1, Line 11
Query aborted-- the maximum reject threshold (0 rows) was reached while reading from an external source: 1 rows rejected out of total 1 rows processed.
Column ordinal: 0, Expected data type: DATE, Offending value: TemporalValue: 3876 (Column Conversion Error), Error: Error converting from RCFile Type DATE to Sql type Timestamp.
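For reference, the "(0 rows)" threshold in that message comes from the external table's reject settings. A sketch of how one might loosen it is below; all object names are hypothetical, and note this only skips bad rows rather than fixing the underlying DATE conversion:

```sql
-- Sketch: external table with a non-zero reject threshold, so a few
-- unconvertible rows are skipped instead of aborting the whole query.
-- Table, data source, and file format names are hypothetical.
CREATE EXTERNAL TABLE dbo.StagedDates (
    SomeDate DATE
)
WITH (
    LOCATION = '/data/',
    DATA_SOURCE = MyExternalSource,
    FILE_FORMAT = MyRcFileFormat,
    REJECT_TYPE = VALUE,
    REJECT_VALUE = 10   -- tolerate up to 10 rejected rows
);
```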