Azure Synapse Analytics
-
Add parameters to Synapse Notebooks
Please add the possibility to pass parameters from pipelines to Synapse Notebooks.
19 votes -
ALTER EXTERNAL DATA SOURCE
If a data source moves or changes, all external tables on it must be dropped, the data source dropped and recreated, and then the external tables recreated.
This can come up during a data warehouse migration and during disaster recovery.
Support for ALTER EXTERNAL DATA SOURCE would make this a much simpler task.
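Purely as illustration, a hedged sketch of the drop-and-recreate sequence this request would avoid; the object names, location, and credential are hypothetical, not taken from the original idea:
-- Today every dependent external table has to be dropped before the data source can change.
DROP EXTERNAL TABLE ext.SalesStaging;          -- repeat for every external table on this source
DROP EXTERNAL DATA SOURCE AzureStorageSource;

-- Recreate the data source pointing at the new location.
CREATE EXTERNAL DATA SOURCE AzureStorageSource
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://container@newaccount.blob.core.windows.net',
    CREDENTIAL = AzureStorageCredential
);

-- Recreate every external table that was dropped above.
CREATE EXTERNAL TABLE ext.SalesStaging
(
    SaleId INT,
    Amount DECIMAL(18, 2)
)
WITH (
    LOCATION = '/sales/',
    DATA_SOURCE = AzureStorageSource,
    FILE_FORMAT = TextFileFormat
);

-- With the requested support this could shrink to something like SQL Server's existing syntax:
-- ALTER EXTERNAL DATA SOURCE AzureStorageSource
--     SET LOCATION = 'wasbs://container@newaccount.blob.core.windows.net';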
19 votes -
Enable DML triggers
Table triggers are very helpful, e.g. for auditing DML code execution.
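As a hedged illustration of what the request would enable (this style of trigger works in SQL Server and Azure SQL DB today; the table and column names are hypothetical):
-- Minimal audit trigger: record who touched dbo.Orders, what kind of statement it was, and when.
CREATE TABLE dbo.OrdersAudit
(
    ActionType NVARCHAR(10)  NOT NULL,
    ChangedBy  NVARCHAR(128) NOT NULL,
    ChangedAt  DATETIME2     NOT NULL
);
GO
CREATE TRIGGER trg_Orders_Audit
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    INSERT INTO dbo.OrdersAudit (ActionType, ChangedBy, ChangedAt)
    SELECT CASE
               WHEN EXISTS (SELECT 1 FROM inserted) AND EXISTS (SELECT 1 FROM deleted) THEN N'UPDATE'
               WHEN EXISTS (SELECT 1 FROM inserted) THEN N'INSERT'
               ELSE N'DELETE'
           END,
           SUSER_SNAME(),
           SYSUTCDATETIME();
END;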
19 votes -
Polybase - fix file format type default for decimals
Currently it's not possible to load a file with an empty decimal value and have it converted to 0 during the load.
This will load fine:
1;2.1;"zzzz"
This will fail:
1;;"zzzz"
According to the documentation (https://msdn.microsoft.com/en-us/library/dn935026.aspx), when USE_TYPE_DEFAULT = TRUE is set in the file format definition, an empty value should be converted to 0. While this works for integers, it doesn't work for decimals. USE_TYPE_DEFAULT = FALSE works fine for decimals (but of course the value will then be loaded as NULL).
Code:
CREATE EXTERNAL FILE FORMAT textfileformatraw
WITH
(
FORMAT_TYPE = DELIMITEDTEXT, …
19 votes
Thank you for reporting this issue. We have identified this as a product defect and plan to fix this behavior in an upcoming release. 7297383.
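The file format definition above is truncated in the original report; for context only, a hedged sketch of what a complete definition with the type default enabled could look like (the field and string terminators are assumptions based on the sample rows):
CREATE EXTERNAL FILE FORMAT textfileformatraw
WITH
(
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS
    (
        FIELD_TERMINATOR = ';',
        STRING_DELIMITER = '"',
        USE_TYPE_DEFAULT = TRUE  -- empty numeric fields are expected to load as 0 rather than NULL
    )
);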
-
Add support for the truncate partition option
Add support for the option to truncate a single partition in a table (as in Azure SQL DB and SQL Server 2016).
https://docs.microsoft.com/en-us/sql/t-sql/statements/truncate-table-transact-sql
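For reference, a hedged sketch of the SQL Server 2016 syntax linked above (table name and partition numbers are illustrative):
-- Truncate only partitions 2 and 4 through 6 of a partitioned fact table.
TRUNCATE TABLE dbo.FactSales
WITH (PARTITIONS (2, 4 TO 6));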
18 votes
Thank you for your request. This item is on our backlog. We will update once the item state changes.
-
Enable distribution elimination in the generated query plan if the query predicate contains the distribution key column as a filter
Say we have the following table:
CREATE TABLE [dbo].[Foo]
(
[columnWithGuids] [uniqueidentifier] NULL,
[column1] [int] NULL,
[column2] [bigint] NULL,
[ingressDay] [date] NULL
)
WITH
(
DISTRIBUTION = HASH ( [columnWithGuids] ),
CLUSTERED COLUMNSTORE INDEX,
PARTITION
(
[ingressDay] RANGE RIGHT FOR VALUES (N'2016-01-01T00:00:00.000', N'2016-01-02T00:00:00.000')
)
)
And we have a query:
SELECT
SUM([column1]) AS [Total]
,[ingressDay]
FROM [dbo].[Foo]
WHERE [columnWithGuids] = '40300bab-03aa-8d51-7ecb-904dfaf47ec7'
GROUP BY
[ingressDay];
Currently, Azure SQL DW does not eliminate distributions based on the query predicate. This means that in the above case, even though only one distribution can ever hold the data we're looking for, it will…
18 votes -
Remove Scalar Function limitations
Currently we cannot do the following:
use global variables within a UDF, which forces us to send the value in from the front end, e.g. @@DATEFIRST (see the sketch below)
reference tables within a UDF
reference temp tables within a UDF
18 votes
Thanks for your suggestion. We are looking into this scenario for a future release. 10703219
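To illustrate the first limitation, a hedged sketch of the workaround described above: the global value is passed in as a parameter because it cannot be read inside the function body (all names are hypothetical):
-- @@DATEFIRST cannot be referenced inside the UDF, so the caller passes it in.
CREATE FUNCTION dbo.GetWeekdayNumber
(
    @SomeDate  DATE,
    @DateFirst INT   -- caller supplies the current @@DATEFIRST value
)
RETURNS INT
AS
BEGIN
    -- Normalize DATEPART(weekday, ...) so Monday is always 1, whatever the session setting is.
    RETURN ((DATEPART(weekday, @SomeDate) + @DateFirst - 2) % 7) + 1;
END;
GO
-- Usage: the global variable is evaluated outside the function.
SELECT dbo.GetWeekdayNumber('2020-06-01', @@DATEFIRST);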
-
SSDT: Support Dynamic data masking
Support for Dynamic data masking
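For context, a hedged sketch of the kind of masked-column definition SSDT would need to model (table and mask functions are illustrative, not part of the original request):
CREATE TABLE dbo.Customers
(
    CustomerId INT NOT NULL,
    Email      NVARCHAR(256) MASKED WITH (FUNCTION = 'email()') NULL,
    Phone      NVARCHAR(20)  MASKED WITH (FUNCTION = 'partial(0, "XXX-XXX-", 4)') NULL
);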
17 votes -
Support Avro format
Please add support in Azure SQL DW for Avro files.
17 votes -
Synapse Analytics Maintenance
Currently, Synapse Analytics instances below DW400c are not notified of maintenance, and maintenance schedules cannot be created for instances at DW400c and under.
Allow all service levels to have maintenance performed within a maintenance schedule.
16 votes -
Support the IIF/CHOOSE logical functions
Azure SQL DW does not support the IIF and CHOOSE logical functions, while SQL Server, SQL DB and SQL MI do. It would be good to support them to reduce the gap between the SQL services.
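For context, a hedged sketch of the requested functions next to the CASE rewrite that is needed today (the table and columns are illustrative):
-- Requested: works in SQL Server, SQL DB and SQL MI.
SELECT IIF(Amount > 100, 'large', 'small')         AS SizeBucket,
       CHOOSE(SaleQuarter, 'Q1', 'Q2', 'Q3', 'Q4') AS QuarterName
FROM dbo.Sales;

-- Current workaround in Azure SQL DW: rewrite as CASE expressions.
SELECT CASE WHEN Amount > 100 THEN 'large' ELSE 'small' END AS SizeBucket,
       CASE SaleQuarter WHEN 1 THEN 'Q1' WHEN 2 THEN 'Q2'
                        WHEN 3 THEN 'Q3' WHEN 4 THEN 'Q4' END AS QuarterName
FROM dbo.Sales;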
16 votes -
Enable AD authentication for Polybase
Currently, SAS is the only option for PolyBase to access Blob storage from the data warehouse.
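For context, a hedged sketch of how access is configured today with a storage account key held in a database scoped credential (all names and the secret are placeholders); the request is to allow an Azure AD identity here instead:
-- Required once per database before creating a credential.
CREATE MASTER KEY;

-- Today the credential carries the storage account key (or, per the request, a SAS token).
CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential
WITH IDENTITY = 'user',                    -- any string when an account key is used
     SECRET   = '<storage-account-key>';

CREATE EXTERNAL DATA SOURCE AzureBlobStore
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://container@myaccount.blob.core.windows.net',
    CREDENTIAL = AzureStorageCredential
);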
16 votes
Thanks for the request! Currently PolyBase only supports Storage Account Keys when connecting to Azure Storage Blobs. SAS token support is on our backlog.
Today, Storage does not support AD as an authentication method. We will update when we move this feature through the backlog.
-
Functionality to upload data to ADW where upload queries do not count toward the 32 concurrent query limit
Today we use PDW/APS, and when we upload data to PDW using DWLoader, the load does not count towards the 32 concurrent query limit; this ensures that users have the full query window available to use the data warehouse.
However, in ADW, when we ingest data with Azure Data Factory or manually using PolyBase, the load counts towards the 32 concurrent query limit because it is technically a query.
It would be great to have a feature similar to PDW's.
16 votes
Thank you for the request. This item is on our backlog. We will update the item when the state changes.
-
Performance with respect to dynamic SQL
In our solutions we load data from CSV files into SQL DW internal tables using PolyBase/external tables.
With each roll-out we face the challenge that the schema (columns) in the CSV may differ for the same table. Therefore we implemented logic to look up the column names and data types from a dictionary table and create the external and internal table schemas dynamically. As we have roughly 500 tables with up to 20 columns per table, automating this process is the only way to go.
The issue we face is that, compared to an on-premises SQL Server, the dynamic…
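For context, a hedged sketch of the kind of dynamic DDL generation described here, assuming a dictionary table and STRING_AGG are available (all object names are hypothetical):
-- Build one external table definition from the column dictionary and execute it.
DECLARE @tableName SYSNAME = N'SalesStaging';
DECLARE @columnList NVARCHAR(MAX);

SELECT @columnList = STRING_AGG(QUOTENAME(ColumnName) + N' ' + DataType, N', ')
                     WITHIN GROUP (ORDER BY OrdinalPosition)
FROM dbo.ColumnDictionary
WHERE TableName = @tableName;

DECLARE @sql NVARCHAR(MAX) =
    N'CREATE EXTERNAL TABLE ext.' + QUOTENAME(@tableName) + N' (' + @columnList + N')
      WITH (LOCATION = ''/' + @tableName + N'/'',
            DATA_SOURCE = AzureStorageSource,
            FILE_FORMAT = textfileformatraw);';

EXEC (@sql);   -- repeated per table; the reported slowdown is in this loop compared to on-premises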
16 votes
Is this the looping execution time or the actual time of creating the tables?
-
Using global parameters
Is it possible to add the use of global parameters in Synapse Studio, exactly the same as in Azure Data Factory? This makes life so much easier when working with different environments.
15 votes -
Allow to share a self hosted IR from Data Factory to Synapse Studio
As of now, it is not possible to reuse an existing Integration Runtime used in a Data Factory with Synapse Studio. This makes the migration and testing process much more complicated than it should be.
15 votes -
Use the same CREATE USER command as Azure SQL DB
Currently, Azure SQL DB supports a version of CREATE USER that allows an SPN to connect to the database via an access token and create Azure AD users. Azure SQL DW does not, and fails with an error from Azure AD; SQL DW requires that an authenticated AD user be logged into the database directly to issue the CREATE USER command.
Making this function work the same on both platforms will allow a single DevOps deployment to create either DB and DW resources, as well as configure users for those resources, from within the deployment script.
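For reference, a hedged sketch of the Azure SQL DB behaviour the request asks to match; the principal names are placeholders:
-- In Azure SQL DB, a service principal connected with an Azure AD access token can run this directly.
CREATE USER [my-deployment-spn] FROM EXTERNAL PROVIDER;
CREATE USER [data.engineer@contoso.com] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [data.engineer@contoso.com];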
15 votes -
Hive metastore integration for polybase / Azure SQL DW
Hive metastore integration for polybase / Azure SQL DW
I want to be able to seamlessly access/import/join tables that are already captured in my common Hive metastore. This would significantly help integrate DW into our big data infrastructure and eliminate the huge duplicated maintenance effort of keeping external table definitions in sync.
15 votes
Thanks for your suggestion. We are looking into this scenario for a future release. 6891253