Azure Synapse Analytics

We would love to hear your ideas for new features for Azure Synapse Analytics. Below, enter a new idea or upvote an existing one. The Synapse engineering team pays attention to all requests.

If instead you need a technical question answered or other help, try these options: Documentation, the MSDN forum, and Stack Overflow. If you need support, please open a support ticket.

  1. Delta Lake support for SQL on-demand

    Currently, Delta Lake tables cannot be queried using SQL on-demand. Adding this capability would make SQL on-demand far more useful for analyzing Delta Lake tables.

    This would save time by allowing any Delta Lake table in a Spark database to be queried without spinning up a Spark pool. (A sketch of the requested syntax follows below.)

    64 votes

    under review  ·  14 comments  ·  Workspace/SQL Scripts
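
    Purely as an illustration of the request, a minimal sketch of what such a query might look like from serverless SQL, assuming a (hypothetical) 'DELTA' format option for OPENROWSET; the storage path is a placeholder:

        -- Hypothetical: serverless SQL reading a Delta Lake folder directly,
        -- without spinning up a Spark pool. FORMAT = 'DELTA' and the path are assumptions.
        SELECT TOP 100 *
        FROM OPENROWSET(
            BULK 'https://<storageaccount>.dfs.core.windows.net/<container>/delta/sales/',
            FORMAT = 'DELTA'
        ) AS rows;
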
  2. Add support for adding custom .jars to spark clusters

    Enable a way to reference and add custom .jar files when a Synapse Spark cluster starts, so that Synapse notebooks can import classes from those .jars.

    38 votes

    3 comments
  3. Support for R notebooks

    Would love to have support for the language R in the notebook experience of Synapse Studio.

    24 votes

    3 comments  ·  Workspace/Notebooks
  4. OPENROWSET for dedicated pools

    Please add the OPENROWSET functionality to dedicated pools (the serverless syntax is sketched below).
    Advantages:
    1. More (and faster) parser options than External File Format, such as the row delimiter
    2. Can auto-infer the schema
    3. More convenient to define the file format directly in the query
    4. Syntactical harmony between serverless and dedicated pools

    23 votes

    under review  ·  0 comments  ·  SQL/Polybase
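
    For reference, this is roughly how OPENROWSET is used in a serverless SQL pool today (the storage path is a placeholder); the request is to allow the same pattern in dedicated pools:

        -- Serverless SQL pool example; not available in dedicated pools today.
        SELECT TOP 100 *
        FROM OPENROWSET(
            BULK 'https://<storageaccount>.dfs.core.windows.net/<container>/sales/*.csv',
            FORMAT = 'CSV',
            PARSER_VERSION = '2.0',   -- faster parser, one of the advantages listed above
            HEADER_ROW = TRUE         -- column names inferred from the header row
        ) AS rows;
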
  5. Add support for "Delta Lake" file format in Azure Data Lake Store / HDFS

    Today we can query data stored in Parquet files on ADLS. It would be fantastic to extend this to support the new "Delta Lake" file format recently open-sourced by the Databricks team (see https://delta.io).

    This would allow us to take advantage of ACID guarantees that the delta format brings to the data lake.

    233 votes

    under review  ·  18 comments  ·  SQL/Polybase
  6. Using Global parameters

    Is it possible to add the use of global parameters in Synapse Studio, exactly the same as in Azure Data Factory? This makes life so much easier when working with different environments.

    15 votes

    1 comment  ·  Workspace/Management
  7. Enable CDM support for Azure Synapse Analytics SQL on-demand

    The "Common Data Model" (CDM) format is becoming increasingly popular. Therefore it would be important that this connector not only exists in the ADF, but that it is also possible to read and write (via CETAS) in CDM directly from SQL on-demand.
    READ (proposed syntax):

        SELECT *
        FROM OPENROWSET(
            BULK <storage account>,
            FORMAT = CDM
        ) AS [r];

    WRITE via CETAS (proposed syntax):

        CREATE EXTERNAL TABLE cdm.cdm_table
        WITH (
            LOCATION = path_to_folder,
            DATA_SOURCE = external_data_source_name,
            FILE_FORMAT = CDM
        )
        AS SELECT * FROM source;

    19 votes

    under review  ·  1 comment  ·  SQL/Integration
  8. Support managing the connection policy

    Add the ability to change the connection policy, as is possible for Microsoft.Sql servers.
    I want to apply the Redirect policy to connections from outside Azure to optimize throughput.

    https://docs.microsoft.com/en-us/cli/azure/sql/server/conn-policy?view=azure-cli-latest

    9 votes

    0 comments  ·  Workspace/Management
  9. Make Azure Firewall Failure Logs available to Customers

    Most Azure products and services have a firewall that can be configured to restrict access. A valuable feature of configuring firewalls appropriately and troubleshooting connectivity issues is being able to see which connection attempts are blocked by the firewall rules. This feature is very common on other commercial firewall products. Please add this ability so that users can review firewall logs for products like SQL Server and others.

    10 votes

    1 comment  ·  SQL/Security
  10. [Synapse Studio] Spark pool: shorten execution time for more than 2 notebooks

    Request to shorten the execution time of a pipeline that runs more than two notebooks in Synapse Studio.
    From my observation, the reason each notebook takes time to run may be that each one has to establish its own Spark session, which is slow.

    Looking at the Spark pool parameters, there is nothing that helps achieve this.

    If the same Spark pool is used across multiple notebooks in a pipeline, please consider allowing them to share the same session to shorten the total execution time.

    7 votes

    0 comments
  11. SQL Server Data Tools for Synapse serverless

    https://cloudblogs.microsoft.com/sqlserver/2019/11/07/new-in-azure-synapse-analytics-cicd-for-sql-analytics-using-sql-server-data-tools/ describes SQL Server Data Tools support for Synapse without mentioning that it applies only to dedicated pools. We need similar support for Synapse serverless as well; otherwise CI/CD is hard to handle.

    8 votes

    0 comments  ·  SQL/T-SQL
  12. EXPLAIN PLAN syntax for serverless SQL pool queries

    The ability to run EXPLAIN (query plan) syntax for serverless SQL pool queries in Azure Synapse Analytics, as is already possible for dedicated SQL pools (see the sketch below).

    6 votes

    1 comment  ·  Workspace/SQL Scripts
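
    For context, dedicated SQL pools already support an EXPLAIN statement that returns the estimated query plan; the request is to offer the same (or an equivalent) for serverless SQL pool queries. A minimal sketch with a placeholder table name:

        -- Works on dedicated SQL pools today; requested for serverless SQL pools as well.
        EXPLAIN
        SELECT COUNT(*)
        FROM dbo.FactSales;          -- placeholder table
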
  13. Drop External Tables

    When executing a DROP EXTERNAL TABLE statement, only the table's metadata gets deleted, not the external files in the ADLS directory, and we have to delete the files manually from the storage account.

    It would be nice if this were taken care of automatically when dropping the external table. (The current behaviour is illustrated below.)

    7 votes

    0 comments  ·  SQL/T-SQL
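
    A minimal illustration of the current behaviour described above (the table name is a placeholder):

        -- Removes only the table metadata; the files in the ADLS folder the table
        -- pointed at are left in place and must be deleted manually from storage.
        DROP EXTERNAL TABLE dbo.SalesExternal;
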
  14. Allow private hosted pip repositories in Synapse

    Allow privately hosted pip repositories in Synapse.
    Private packages are commonly used, and currently the workaround of copying all the code is pretty obnoxious.

    7 votes

    0 comments  ·  Workspace/Spark
  15. Need %pip installer within the workspace notebook, so we can install libraries as needed immediately

    Need %pip installer within the workspace notebook, so we can install libraries as needed immediately

    8 votes

    0 comments  ·  Workspace/Spark
  16. Provide a link to the executed notebook from a pipeline run

    In the scenario where we have a notebook activity in a pipeline, when we run the pipeline, we would like the option to open the notebook that ran. Similar behaviour exists in ADF for Databricks: when running a Databricks notebook from ADF, the activity output contains a link to the notebook run.

    5 votes

    0 comments  ·  Workspace/Pipelines
  17. CETAS: specify the number of Parquet files / file size

    Parquet file count, file size, and row group size greatly influence query speed in Synapse serverless. Yet Parquet file creation through CETAS cannot be configured in any way except for the type of compression (see the example below). Moreover, CETAS is not consistent in Parquet file creation; the size and number of generated files vary wildly. I've seen CETAS queries return a single 1.5 GB file, or dozens of 1 MB files. Given this behaviour, it is very hard to use CETAS as part of a production data pipeline; at the moment it is more of a prototyping tool.

    6 votes

    2 comments
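
    To illustrate, this is roughly what a CETAS statement looks like today: the compression codec on the external file format is the only output-related setting, and there is no knob for file count, file size, or row group size. Object names and the path are placeholders:

        -- The compression codec is the only configurable aspect of the Parquet output.
        CREATE EXTERNAL FILE FORMAT ParquetSnappy
        WITH (
            FORMAT_TYPE = PARQUET,
            DATA_COMPRESSION = 'org.apache.hadoop.io.compress.SnappyCodec'
        );

        CREATE EXTERNAL TABLE dbo.SalesCurated
        WITH (
            LOCATION = 'curated/sales/',
            DATA_SOURCE = MyDataLake,        -- existing external data source (placeholder)
            FILE_FORMAT = ParquetSnappy
        )
        AS
        SELECT * FROM dbo.SalesStaging;      -- no way to control file count, size, or row groups
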
  18. 6 votes

    0 comments  ·  SQL/Management
  19. Allow a subset of users to drop serverless external tables via standard SQL and optionally have the underlying files deleted

    We need the ability for a subset of users to drop Synapse serverless external tables via standard SQL and have the underlying files deleted, without knowing about the underlying storage blob containers: a SQL command option (e.g. DROP FILE) that allows a user to drop a table without having to manually delete files from storage. A sketch of the requested option follows below.

    6 votes

    0 comments  ·  Workspace/Other
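
    Purely to illustrate the requested behaviour, a sketch of what such an option might look like; this syntax does not exist today, and the option and table names are hypothetical:

        -- Hypothetical: drop the external table AND delete its underlying files,
        -- without the user needing access to (or knowledge of) the storage container.
        DROP EXTERNAL TABLE dbo.SalesExternal
        WITH (DROP FILE = ON);               -- proposed option, not valid T-SQL today
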
  20. GRANT IMPERSONATE ON USER in the serverless pool

    GRANT IMPERSONATE ON USER::User1 TO User2 is not working in the Synapse serverless pool, although Microsoft’s documentation says it is applicable. In general, GRANT on database principals doesn't work in the serverless pool.

    4 votes

    1 comment