Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will move to first-party solutions for customer feedback. Learn more here.

Azure Synapse Analytics

We would love to hear your ideas for new features for Azure Synapse Analytics. Below, enter a new idea or upvote an existing one. The Synapse engineering team pays attention to all requests.

If you instead need a technical question answered or help, try one of these options: Documentation, MSDN forum, and Stack Overflow. If you need support, please open a support ticket.

  1. Having a link to the notebook run from a pipeline

    In the scenario where a pipeline contains a notebook activity, when we run the pipeline we want the option to open the notebook run that was executed. Similar behaviour exists in ADF for Databricks: when running a Databricks notebook from ADF, the activity output includes a link to the notebook run.

    5 votes

    0 comments  ·  Workspace/Pipelines
  2. ADF Power BI Dataset Refresh Activity

    Please consider creating an ADF activity that lets a pipeline refresh a Power BI dataset (a sketch of the current REST-based workaround follows below).

    3 votes

    0 comments  ·  Workspace/Pipelines
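
    Until a native activity exists, a common workaround is to call the Power BI REST API from a Web activity or a notebook. Below is a minimal Python sketch, assuming a service principal that has been granted access to the target workspace; the tenant, workspace, and dataset IDs are placeholders.

        # Minimal sketch: queue a Power BI dataset refresh via the REST API.
        # Assumes a service principal with access to the workspace; IDs are placeholders.
        import requests
        from azure.identity import ClientSecretCredential

        credential = ClientSecretCredential(
            tenant_id="<tenant-id>",
            client_id="<app-id>",
            client_secret="<app-secret>",
        )
        # Token scoped to the Power BI service.
        token = credential.get_token("https://analysis.windows.net/powerbi/api/.default").token

        group_id = "<workspace-guid>"    # Power BI workspace (group) ID
        dataset_id = "<dataset-guid>"    # dataset to refresh

        resp = requests.post(
            f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes",
            headers={"Authorization": f"Bearer {token}"},
        )
        resp.raise_for_status()  # 202 Accepted means the refresh was queued
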
  3. Add parameters to Synapse Notebooks

    Please add the ability to pass parameters from pipelines to Synapse notebooks (a sketch of the notebook-side pattern follows below).

    20 votes

    1 comment  ·  Workspace/Pipelines
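
    For reference, a common notebook-side pattern once parameters can be passed is a dedicated parameters cell whose defaults the pipeline overrides at run time. A minimal sketch; the parameter names and storage path are illustrative only.

        # Parameters cell: defaults that a pipeline run would override.
        # The names below are illustrative, not part of any existing API.
        run_date = "2021-01-01"
        source_container = "raw"

        # The rest of the notebook consumes the injected values.
        input_path = f"abfss://{source_container}@mystorageaccount.dfs.core.windows.net/{run_date}/"
        print(f"Reading from {input_path}")
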
  4. Add SQL Scripts as an "activity" to Azure Synapse Analytics workspace

    My idea is to add "SQL Scripts" as an "activity" in the Azure Synapse Analytics workspace so that they can be triggered.

    In the "Integrate" tab we can already create pipelines with many different activities, such as stored procedures, Spark notebooks, and data flows, but we cannot create a pipeline with SQL scripts directly from the "Develop" tab. With this feature we would be able to trigger SQL scripts directly in the Synapse Analytics workspace without needing to create stored procedures.

    4 votes

    0 comments  ·  Workspace/Pipelines
  5. Notebook Preview support in Pipelines

    It would be nice if pipelines supported the notebook preview features when they are enabled. For example, if I use the %run <notebook> feature after enabling the preview features, I can run the notebook manually, but if I put it in a pipeline it isn't recognized.

    2 votes

    0 comments  ·  Workspace/Pipelines
  6. Spark Job Definition - Add Parameters

    A Spark job definition should allow parameters to be passed from a pipeline. I want to run the same job many times in a loop, passing in a variable for each iteration (a sketch of the job-side pattern follows below).

    11 votes

    0 comments  ·  Workspace/Pipelines
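
    For context, a Spark job definition already takes command-line arguments; the ask is to populate them dynamically from the pipeline (for example, inside a ForEach loop). A minimal sketch of a job script that reads its arguments; the argument names and path are illustrative.

        # Entry script for a Spark job definition that reads its command-line arguments,
        # e.g. invoked as: <script> 2021-06-01 sales. Names and paths are illustrative.
        import sys
        from pyspark.sql import SparkSession

        if __name__ == "__main__":
            run_date, table_name = sys.argv[1], sys.argv[2]

            spark = SparkSession.builder.appName("parameterized-job").getOrCreate()
            df = spark.read.parquet(
                f"abfss://raw@mystorageaccount.dfs.core.windows.net/{table_name}/{run_date}/"
            )
            print(f"{table_name} rows for {run_date}: {df.count()}")
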
  7. Export query result in SQL on-demand to external storage in Synapse Studio

    Currently a Synapse workspace doesn't support creating a linked service specifically for SQL on-demand. Since we cannot create a dataset without a linked service, we cannot use the copy activity in an ADF pipeline to export data from SQL on-demand to a storage account.
    The current workaround is to create a linked service of type Azure SQL Database and manually enter the SQL on-demand endpoint as the server name (a sketch follows below).

    However, we were wondering if it is possible to add a linked service type specifically for SQL on-demand, or if it is possible to add a note…

    2 votes

    planned  ·  0 comments  ·  Workspace/Pipelines
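
    The same workaround can also be used outside a linked service: the serverless endpoint accepts standard SQL connections, so any client that can reach an Azure SQL Database can query it. A minimal Python sketch assuming pyodbc and a SQL login; the server, database, and view names are placeholders.

        # Minimal sketch: query the SQL on-demand (serverless) endpoint like a regular
        # Azure SQL Database. Server, database, credentials, and view are placeholders.
        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=<workspace-name>-ondemand.sql.azuresynapse.net;"
            "DATABASE=<database>;"
            "UID=<sql-user>;PWD=<password>"
        )
        cursor = conn.cursor()
        cursor.execute("SELECT TOP 10 * FROM dbo.my_view")  # hypothetical view
        for row in cursor.fetchall():
            print(row)
        conn.close()
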
  8. Monitor consumption of individual pipelines

    This feature is available in Data Factory but not in Synapse. We really need this to understand the cost of pipelines.

    1 vote

    0 comments  ·  Workspace/Pipelines
  9. Show references for datasets just like the 'related' tab in a linked service

    It would be very useful to be able to trace where a specific dataset is used, just like the 'Related' tab for linked services. This would make it easier to clean up datasets that are no longer in use (a scripted approximation is sketched below).

    1 vote

    0 comments  ·  Workspace/Pipelines
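
    Until such a view exists, the references can be approximated with a script that lists pipeline definitions from the data-plane REST API and searches them for the dataset name. A minimal Python sketch; the workspace, dataset name, and api-version are placeholders.

        # Minimal sketch: find pipelines that reference a given dataset by scanning
        # pipeline definitions from the data-plane REST API. Names are placeholders.
        import json
        import requests
        from azure.identity import DefaultAzureCredential

        workspace = "<workspace-name>"
        dataset_name = "my_dataset"  # hypothetical dataset to trace

        token = DefaultAzureCredential().get_token("https://dev.azuresynapse.net/.default").token
        resp = requests.get(
            f"https://{workspace}.dev.azuresynapse.net/pipelines",
            params={"api-version": "2020-12-01"},
            headers={"Authorization": f"Bearer {token}"},
        )
        resp.raise_for_status()

        for pipeline in resp.json().get("value", []):
            # Dataset references appear as {"referenceName": "...", "type": "DatasetReference"}.
            if f'"{dataset_name}"' in json.dumps(pipeline):
                print(f"{pipeline['name']} references {dataset_name}")
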
  10. Import pipelines programmatically

    Automate pipeline creation via JSON and PowerShell or an API (a REST-based sketch follows below).

    4 votes

    0 comments  ·  Workspace/Pipelines
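
    One way this can already be approximated is through the Synapse data-plane REST endpoint, creating a pipeline from its JSON definition. A minimal Python sketch using azure-identity and requests; the workspace name, pipeline name, JSON file, and api-version are assumptions to adjust.

        # Minimal sketch: create or update a Synapse pipeline from a JSON definition.
        # Workspace, pipeline name, file, and api-version are assumptions to verify.
        import json
        import requests
        from azure.identity import DefaultAzureCredential

        workspace = "<workspace-name>"
        pipeline_name = "my_pipeline"

        token = DefaultAzureCredential().get_token("https://dev.azuresynapse.net/.default").token

        with open("my_pipeline.json") as f:  # exported pipeline definition
            definition = json.load(f)

        resp = requests.put(
            f"https://{workspace}.dev.azuresynapse.net/pipelines/{pipeline_name}",
            params={"api-version": "2020-12-01"},
            headers={"Authorization": f"Bearer {token}"},
            json=definition,
        )
        resp.raise_for_status()
        print(resp.status_code)
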
  11. Support Cosmos script on Synapse pipeline

    Azure Data Factory supports a Cosmos script activity as part of a pipeline, but Synapse doesn't support Scope scripts. Is there any plan to add support for Scope scripts?

    1 vote

    0 comments  ·  Workspace/Pipelines
  12. Accept a config file as an input for debug sessions

    During a data flow debug session, parameters have to be entered manually into the fields (for both data flow parameters and pipeline parameters) under the debug parameters tab. The ability to upload a config file with those values would speed up development.

    1 vote

    0 comments  ·  Workspace/Pipelines
  13. Automatically delete unused pipelines, linked services, and datasets in Synapse

    We published a pipeline into a new workspace, but some other pipelines, linked services, and datasets were unwanted, so I deleted them manually. I want to automate this in a build/release pipeline so that the unwanted items are deleted automatically (a scripted sketch follows below).

    1 vote

    0 comments  ·  Workspace/Pipelines
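
    As a stopgap, the cleanup can be scripted against the data-plane REST endpoint and run as a release step. A minimal Python sketch assuming a hand-maintained list of artifact names to remove; the workspace, names, and api-version are placeholders.

        # Minimal sketch: delete unwanted pipelines from a release step via REST.
        # Workspace, pipeline names, and api-version are placeholders to adjust.
        import requests
        from azure.identity import DefaultAzureCredential

        workspace = "<workspace-name>"
        unwanted_pipelines = ["old_pipeline_1", "old_pipeline_2"]  # hypothetical names

        token = DefaultAzureCredential().get_token("https://dev.azuresynapse.net/.default").token
        headers = {"Authorization": f"Bearer {token}"}

        for name in unwanted_pipelines:
            resp = requests.delete(
                f"https://{workspace}.dev.azuresynapse.net/pipelines/{name}",
                params={"api-version": "2020-12-01"},
                headers=headers,
            )
            print(name, resp.status_code)  # 200/202 on success, 404 if already gone
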
  14. Support for SDK like ADF

    Currently there is no SDK for Synapse that lets us deploy pipelines via code.

    Enabling this support would allow us to migrate our existing tools (the ADF-side pattern is sketched below for comparison).

    9 votes

    0 comments  ·  Workspace/Pipelines
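
    For comparison, this is the kind of flow existing tools use against Data Factory with the azure-mgmt-datafactory package; the request is an equivalent, supported path for Synapse pipelines. A minimal sketch; resource names are placeholders and the Wait activity is just an example payload.

        # For comparison: deploying a pipeline via the ADF management SDK.
        # Resource names are placeholders; the ask is an equivalent path for Synapse.
        from azure.identity import DefaultAzureCredential
        from azure.mgmt.datafactory import DataFactoryManagementClient
        from azure.mgmt.datafactory.models import PipelineResource, WaitActivity

        client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

        pipeline = PipelineResource(
            activities=[WaitActivity(name="wait_step", wait_time_in_seconds=5)]
        )
        client.pipelines.create_or_update(
            "<resource-group>", "<factory-name>", "demo_pipeline", pipeline
        )
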
  15. Hourly schedule like SQL Agent (starting/ending at)

    Currently it is possible to start the trigger execution by the hour, but SQL Agent allows a better approach because it includes a start/end time during the day. This makes it easy to process only during business hours, for example. Please consider it, thanks! (A schedule-trigger sketch follows below.)

    4 votes

    0 comments  ·  Workspace/Pipelines
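
    One way to approximate business-hours-only runs today is a schedule trigger whose recurrence lists the specific hours and weekdays to fire. The sketch below expresses that trigger JSON as a Python dict; the hours, time zone, and pipeline name are illustrative.

        # Minimal sketch: schedule-trigger recurrence restricted to business hours.
        # Hours, minutes, weekdays, time zone, and pipeline name are illustrative.
        business_hours_trigger = {
            "name": "BusinessHoursTrigger",
            "properties": {
                "type": "ScheduleTrigger",
                "typeProperties": {
                    "recurrence": {
                        "frequency": "Week",
                        "interval": 1,
                        "startTime": "2021-06-01T08:00:00Z",
                        "timeZone": "UTC",
                        "schedule": {
                            "hours": [8, 9, 10, 11, 12, 13, 14, 15, 16, 17],
                            "minutes": [0],
                            "weekDays": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
                        },
                    }
                },
                "pipelines": [
                    {"pipelineReference": {"referenceName": "my_pipeline", "type": "PipelineReference"}}
                ],
            },
        }
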