Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will leverage first-party solutions for customer feedback. Learn more here.

Data Factory

Azure Data Factory allows you to manage the production of trusted information by offering an easy way to create, orchestrate, and monitor data pipelines over the Hadoop ecosystem using structured, semi-structured, and unstructured data sources. You can connect to your on-premises SQL Server, Azure databases, tables, or blobs and create data pipelines that will process the data with Hive and Pig scripting, or custom C# processing. The service offers a holistic monitoring and management experience over these pipelines, including a view of their data production and data lineage down to the source systems. The outcome of Data Factory is the transformation of raw data assets into trusted information that can be shared broadly with BI and analytics tools.

Do you have an idea, suggestion or feedback based on your experience with Azure Data Factory? We’d love to hear your thoughts.

  1. Provide more ways to log in to GitHub

    For Data Factories that connect to GitHub, provide other ways to log in. For example, out-of-band authentication could be performed using the OAuth Device Code flow, or by logging in with a GitHub Personal Access Token. For setups where the account used to manage Data Factory is not the same as the account used with GitHub, it is currently very difficult to connect the two services, especially in high-security environments where Conditional Access and similar policies impose further restrictions on accounts. (A sketch of the device-code flow follows below.)

    46 votes  ·  0 comments
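
    For reference, a minimal sketch of the OAuth Device Code flow against GitHub's device-flow endpoints; the client ID is a placeholder for whatever app registration ADF would use:

    import time
    import requests

    CLIENT_ID = "<github-oauth-app-client-id>"  # placeholder, not a real app

    # Step 1: request a device/user code pair from GitHub.
    resp = requests.post(
        "https://github.com/login/device/code",
        data={"client_id": CLIENT_ID, "scope": "repo"},
        headers={"Accept": "application/json"},
    ).json()
    print(f"Visit {resp['verification_uri']} and enter code {resp['user_code']}")

    # Step 2: poll until the user has authorized from a trusted device.
    while True:
        time.sleep(resp["interval"])
        token = requests.post(
            "https://github.com/login/oauth/access_token",
            data={
                "client_id": CLIENT_ID,
                "device_code": resp["device_code"],
                "grant_type": "urn:ietf:params:oauth:grant-type:device_code",
            },
            headers={"Accept": "application/json"},
        ).json()
        if "access_token" in token:
            break  # this token is what ADF would use for Git operations
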
  2. Add multi-factor authentication for SFTP connector

    The existing SFTP connector in Azure Data Factory does not support multi-factor authentication; it supports either password-based or key-based authentication. Enterprises are moving towards multi-factor authentication requiring both a key and a password for SFTP. This is a must-have feature given the focus on information security. (A sketch of what two-step authentication looks like on the client side follows below.)

    72 votes  ·  4 comments
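
    For illustration, a sketch of two-step SFTP authentication (key plus password) on the client side with Python's paramiko, e.g. as an interim workaround in a custom activity; the host and credentials are placeholders, and it assumes paramiko's partial-success fallback matches the server's policy:

    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin host keys in production

    # paramiko attempts the key first and, on "partial success", continues
    # with the password, which is how key+password servers negotiate.
    client.connect(
        "sftp.example.com",               # placeholder host
        username="loader",
        key_filename="/path/to/id_rsa",
        password="secret-from-key-vault",
        look_for_keys=False,
    )
    sftp = client.open_sftp()
    sftp.get("/outbound/extract.csv", "/tmp/extract.csv")
    sftp.close()
    client.close()
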
  3. Please allow users to automate “Publish”.

    Currently, somebody must go to the dev ADF, open the master branch, and hit “Publish” for the changes to reach the adf_publish branch.

    Automating “Publish” on the master branch is necessary for improving efficiency and saving time. (One possible automation is sketched below.)

    168 votes  ·  started  ·  6 comments
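
    One possible shape for that automation, sketched with the Python management SDK: a CI job on the master branch deploys the pipeline JSON directly, instead of waiting for someone to press Publish. All names are placeholders, and create_or_update may need the JSON mapped onto the SDK's PipelineResource model rather than a raw dict:

    import json
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # "pipeline/" is the folder ADF writes into the Git collaboration branch.
    with open("pipeline/copy_daily.json") as f:
        doc = json.load(f)

    client.pipelines.create_or_update(
        "my-rg", "my-dev-adf", doc["name"],
        {"properties": doc["properties"]},
    )
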
  4. Select all option on Trigger

    Hi Team,

    We have hundreds of triggers and the number keeps growing. As part of BAU support we sometimes need to disable them all, but there is no option to select all triggers and disable them at once.

    Feature to be added: a Select All option with enable and disable buttons in the trigger section, so this can be done in one shot. (A script-based workaround is sketched below.)

    Thanks
    Jawahar

    12 votes  ·  1 comment
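
    Until such a button exists, a script can approximate "select all and disable"; a sketch with the Python management SDK (method names as in recent azure-mgmt-datafactory releases; resource names are placeholders):

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
    rg, factory = "my-rg", "my-adf"

    # Stop (disable) every trigger in the factory in one pass;
    # begin_start would re-enable them the same way.
    for trigger in client.triggers.list_by_factory(rg, factory):
        client.triggers.begin_stop(rg, factory, trigger.name).result()
        print(f"disabled {trigger.name}")
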
  5. GitLab Integration in Azure Data Factory

    It would be useful to have GitLab integration in Azure Data Factory alongside GitHub and Azure Repos, as GitLab is one of the most popular tools.

    127 votes  ·  1 comment
  6. ADF SharePoint List - Allow Multiple Values

    The current version of this connector treats any SharePoint List columns that allow multiple values as a “complex type”. These are not supported by the connector, so the data in those columns is not available.

    27 votes  ·  3 comments
  7. Pipeline Dependencies in Azure Data Factory

    When using a hierarchical structure of pipelines (pipelines referring to other pipelines) in ADF, it can get messy and confusing quite fast. To get a good picture of how things are put together, I would love the ability to show visual dependencies between pipelines, just like Power BI does for query dependencies.

    I believe one should be able to see dependencies between all the pipelines, but to minimize complexity and increase focus, a “drill-down” view should also be available if needed, i.e. a dependency view per folder. (A sketch of how the dependency edges can be extracted follows below.)

    Total view (All pipeline dependencies) -->…

    39 votes  ·  0 comments
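
    The raw data for such a view already lives in the Git repo; a sketch that extracts pipeline-to-pipeline edges from the exported JSON (for brevity it only walks top-level activities, not activities nested inside ForEach or If containers):

    import json
    import pathlib

    edges = []
    for path in pathlib.Path("pipeline").glob("*.json"):  # ADF's default Git folder
        doc = json.loads(path.read_text())
        for activity in doc.get("properties", {}).get("activities", []):
            if activity.get("type") == "ExecutePipeline":
                child = activity["typeProperties"]["pipeline"]["referenceName"]
                edges.append((doc["name"], child))

    for parent, child in sorted(edges):
        print(f"{parent} --> {child}")
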
  8. Add ability to customize output fields from Execute Pipeline Activity

    This request comes directly from a StackOverflow post: https://stackoverflow.com/questions/57749509/how-to-get-custom-output-from-an-executed-pipeline .
    Currently, the output of the Execute Pipeline activity is limited to the name and runId of the executed pipeline, making it difficult to pass any data or settings from the executed pipeline back to the parent pipeline. For instance, if a variable is set in the child pipeline, there is no built-in way to pass it back in the Azure Data Factory UI. A couple of workarounds exist, as detailed in the StackOverflow post (one is sketched below), but adding this as a built-in feature would greatly enhance the ability…

    709 votes  ·  under review  ·  11 comments
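
    One of those workarounds boils down to a storage handoff: the child run writes its result keyed by its run ID, and the parent reads it back right after the Execute Pipeline activity (whose output does include the child runId). A sketch of the pattern with azure-storage-blob; the connection string, container, and run ID are placeholders:

    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<storage-connection-string>")

    def write_child_output(run_id: str, payload: str) -> None:
        # called from the child pipeline; assumes the container already exists
        blob = service.get_blob_client("pipeline-outputs", f"{run_id}.json")
        blob.upload_blob(payload, overwrite=True)

    def read_child_output(run_id: str) -> str:
        # called from the parent pipeline after Execute Pipeline completes
        blob = service.get_blob_client("pipeline-outputs", f"{run_id}.json")
        return blob.download_blob().readall().decode("utf-8")
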
  9. Self-hosted integration runtime security; store credentials locally without flowing the credentials through Azure backend

    If the owner of the Data Factory should not know the self-hosted integration runtime credentials, the client has to use New-AzDataFactoryV2LinkedServiceEncryptedCredential, which needs a connection to the Azure Data Factory; the client must therefore have credentials for the Azure account that hosts the Data Factory. The other option is to put the credentials in a key vault (managed by one party or the other) and, again, exchange Azure credentials for access.

    There is no way to break this chain: either the client receives too much security information about the server, or the server knows too much about the client’s security.

    In the situation that the DF pipelines are exposed…

    24 votes  ·  0 comments
  10. 127 votes  ·  5 comments
  11. Managed Private Endpoints - NCUS

    Managed Private Endpoints for the North Central US region. Currently documented and available here: https://docs.microsoft.com/en-us/azure/data-factory/managed-virtual-network-private-endpoint

    We would like this to avoid the need to whitelist ADF IP ranges and to expose specific internal IP addresses inside our network in order to communicate over VPN with secured internal resources.

    13 votes  ·  0 comments
  12. Please add configurable, per activity default settings

    The default timeout for Copy activities in Azure Data Factory is 7 days.
    We routinely set this property to a much lower value, individually, for every Copy activity we use.

    More generally, it would be great to be able to configure default values per activity type, adjusting them to local procedures.

    That way, when developing new pipelines, not every property would need to be set individually again and again.

    So please provide adjustable, per-activity default property values. (A repo-side workaround is sketched below.)

    21 votes  ·  0 comments
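
    As a stopgap, such defaults can be enforced on the repo side; a sketch that rewrites the timeout on every Copy activity in the exported pipeline JSON before changes are published (the 2-hour value and the "pipeline" folder are assumptions):

    import json
    import pathlib

    DEFAULT_COPY_TIMEOUT = "0.02:00:00"  # 2 hours instead of the 7-day default

    for path in pathlib.Path("pipeline").glob("*.json"):
        doc = json.loads(path.read_text())
        changed = False
        for activity in doc.get("properties", {}).get("activities", []):
            if activity.get("type") == "Copy":
                activity.setdefault("policy", {})["timeout"] = DEFAULT_COPY_TIMEOUT
                changed = True
        if changed:
            path.write_text(json.dumps(doc, indent=4))
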
  13. Common Data Service connector: Alternate key support for lookups

    Please provide support to use alternate keys to map lookups documented here: https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/use-alternate-key-create-record#using-alternate-keys-to-create-an-entityreference

    For example, it would be nice to use an alternate key defined on contact when importing accounts through ADF. (The Web API syntax involved is sketched below.)

    87 votes  ·  2 comments
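
    For reference, the Web API syntax the connector would need to emit, sketched with plain requests; the org URL, token, and the contactnumber alternate key are placeholders:

    import requests

    resp = requests.post(
        "https://myorg.crm.dynamics.com/api/data/v9.1/accounts",
        headers={
            "Authorization": "Bearer <token>",
            "Content-Type": "application/json",
        },
        json={
            "name": "Contoso",
            # lookup bound by alternate key instead of GUID, per the linked doc
            "primarycontactid@odata.bind": "/contacts(contactnumber='C-1001')",
        },
    )
    resp.raise_for_status()
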
  14. Fault Tolerance - Skip Incompatible Columns

    I am loading data dynamically from multiple SQL Server databases. All of the databases share the same table structure. However, there may be some exceptions where one of the databases has a missing column in one of the tables.

    When such a situation occurs, I would like to set the fault tolerance to skip the incompatible column: instead of the copy failing or skipping all the rows, it should skip just that single column.

    That way, instead of losing all the columns, I lose one column, which is not used anyway since it does not exist in that source.

    100 votes  ·  1 comment
  15. Dark mode

    Working with ADF is a good experience, but it would be more user-friendly if ADF supported dark mode; the all-white UI is distracting.

    28 votes  ·  1 comment
  16. Allow multiple SQL statements to be run in Lookup Activity to Snowflake

    I want to perform multiple SQL statements in one run of the Lookup activity. For example, execute a command to create a table and then read from it, like this:

    create temporary table mytemptable (id number, creation_date date);
    select * from mytemptable;

    Currently there is a limitation and I get the error message (a client-side workaround is sketched below):
    ERROR [0A000] Multiple SQL statements in a single API call are not supported; use one API call per statement instead.

    13 votes  ·  0 comments
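
    Until the connector supports this, the statements have to go out one API call at a time; a sketch with snowflake-connector-python, whose execute_string helper does the splitting (credentials are placeholders):

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="<wh>", database="<db>", schema="<schema>",
    )
    # execute_string splits on ';' and issues one API call per statement,
    # which is exactly the restriction the error message describes.
    cursors = conn.execute_string(
        "create temporary table mytemptable (id number, creation_date date);"
        "select * from mytemptable;"
    )
    for row in cursors[-1]:  # results of the final select
        print(row)
    conn.close()
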
  17. Support for implementing custom user functions in Mapping DataFlow (Azure Data Factory)

    It would be great if we could implement our own functions in Mapping Data Flow.
    User-defined functions on Azure Databricks (https://docs.databricks.com/spark/latest/spark-sql/udf-python.html) are a similar, great capability (sketched below).

    3 votes  ·  0 comments
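
    For comparison, the Databricks capability referenced above, as a small PySpark sketch (clean_code is just an illustrative function):

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import udf
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.getOrCreate()

    @udf(returnType=StringType())
    def clean_code(value: str) -> str:
        # user-defined logic that built-in expressions cannot express
        return (value or "").strip().upper()

    df = spark.createDataFrame([(" ab-1 ",), ("cd-2",)], ["code"])
    df.select(clean_code("code").alias("code")).show()
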
  18. ADF: If Condition exit instead of inner activity

    As inner activities are limited (you cannot put an If Condition inside another If Condition), please change the If Condition activity to have separate true and false exits instead of inner activities.

    4 votes  ·  1 comment
  19. Validating whether SHIR is encrypted via PowerShell or Portal

    Currently, we can only validate whether it is SSL-encrypted manually, under Settings in Microsoft Integration Runtime Configuration Manager.

    It would be great to have a feature to validate this without opening Microsoft Integration Runtime Configuration Manager.

    For example, if there were a return value describing the encryption, we could leverage it to automate the validation process via a PowerShell script.

    10 votes  ·  0 comments
  20. Ability to reuse one Databricks Job Cluster for multiple Databricks activities

    Job clusters are the preferred way to run Databricks notebooks.

    It would be very helpful if the same Databricks job cluster could be used for multiple Databricks activities, for example for all Databricks activities in one pipeline. (The equivalent capability in the Databricks Jobs API is sketched below.)

    13 votes  ·  1 comment
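
    Outside ADF, the Databricks Jobs API 2.1 already lets several tasks share one job cluster via job_cluster_key; this request is effectively for ADF's Databricks activities to expose the same behavior. A sketch of the Jobs API payload (workspace URL, token, and cluster spec are placeholders):

    import requests

    job = {
        "name": "shared-cluster-demo",
        "job_clusters": [{
            "job_cluster_key": "shared",
            "new_cluster": {
                "spark_version": "9.1.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 2,
            },
        }],
        "tasks": [
            {"task_key": "step1", "job_cluster_key": "shared",
             "notebook_task": {"notebook_path": "/Jobs/step1"}},
            {"task_key": "step2", "job_cluster_key": "shared",
             "depends_on": [{"task_key": "step1"}],
             "notebook_task": {"notebook_path": "/Jobs/step2"}},
        ],
    }
    resp = requests.post(
        "https://<workspace>.azuredatabricks.net/api/2.1/jobs/create",
        headers={"Authorization": "Bearer <pat>"},
        json=job,
    )
    resp.raise_for_status()
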