Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year, and will leverage first-party solutions for customer feedback.

Data Factory

Azure Data Factory allows you to manage the production of trusted information by offering an easy way to create, orchestrate, and monitor data pipelines over the Hadoop ecosystem, using structured, semi-structured, and unstructured data sources. You can connect to your on-premises SQL Server, Azure database, tables, or blobs and create data pipelines that process the data with Hive and Pig scripting or custom C# processing. The service offers a holistic monitoring and management experience over these pipelines, including a view of their data production and data lineage down to the source systems. The outcome of Data Factory is the transformation of raw data assets into trusted information that can be shared broadly with BI and analytics tools.

Do you have an idea, suggestion or feedback based on your experience with Azure Data Factory? We’d love to hear your thoughts.

  1. Ability to fetch keys from App Configuration in Data Factory pipeline activities

    During my project work, I ran into a situation where I needed to read some config values from Azure App Configuration in my pipeline activities. We can fetch keys from Key Vault in pipeline activities, but we can't do the same with App Configuration. Data Factory should allow connectivity to App Configuration so that config values can be pulled easily into pipeline activities. Config values can vary per environment, which would help in the CI/CD process for ADF.
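
    A minimal sketch of the kind of lookup being asked for, done today against the App Configuration REST API (for example from an Azure Function that the pipeline calls). The store name and key are hypothetical, and the token scope is assumed to be the store endpoint:

        import requests
        from azure.identity import DefaultAzureCredential

        endpoint = "https://my-config-store.azconfig.io"   # hypothetical store
        # scope assumed to be the store endpoint + /.default
        token = DefaultAzureCredential().get_token(f"{endpoint}/.default").token

        resp = requests.get(
            f"{endpoint}/kv/Environment:ConnectionString",  # hypothetical key
            params={"api-version": "1.0"},
            headers={"Authorization": f"Bearer {token}"},
        )
        resp.raise_for_status()
        print(resp.json()["value"])   # the configuration value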

    1 vote · 0 comments
  2. Activity name can be passed to the next activity

    As of now there is no provision to know the name of the activity that triggered the next activity. It would be great to have the activity name passed from one activity to the next, the same way we can get the pipeline name and data factory name (via @pipeline().Pipeline and @pipeline().DataFactory).

    1 vote · 0 comments
  3. Option to execute specific activities, like SQL Server Integration Services (SSIS)

    I suggest that Data Factory have an option to execute specific activities, as SSIS does: allow disabling some activities in a pipeline so that only the activities the user needs are run.

    I think this option would help everybody in their projects.

    1 vote · 0 comments
  4. Do not overwrite customer proxy configurations when upgrading ADF/SHIR

    This is a source of production and test problems.

    When a major upgrade occurs, Microsoft should migrate the existing configuration settings in diahost.exe.config and diawp.exe.config from the earlier installation (the system tag entries), or at least provide an option to migrate configurations during the process. It's highly unlikely that proxies change day to day.

    What can happen is that an upgrade occurs and everything stops working; it takes a while to address and investigate why, and it can be difficult to determine whether it's an upgrade, pipeline, or network issue.

    Migrating the configuration would save customers and…
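
    A minimal migration sketch, assuming the proxy settings live in the standard <system.net> section of diahost.exe.config / diawp.exe.config; the paths are illustrative. It grafts the pre-upgrade section into the freshly installed file:

        import xml.etree.ElementTree as ET

        def carry_over_proxy(old_config: str, new_config: str) -> None:
            # lift <system.net> (proxy settings) out of the pre-upgrade copy
            old_net = ET.parse(old_config).getroot().find("system.net")
            if old_net is None:
                return  # nothing to migrate
            new_tree = ET.parse(new_config)
            new_root = new_tree.getroot()
            stale = new_root.find("system.net")
            if stale is not None:
                new_root.remove(stale)   # drop the default section
            new_root.append(old_net)     # re-apply the customer's settings
            new_tree.write(new_config, xml_declaration=True, encoding="utf-8")

        carry_over_proxy(
            r"C:\backup\diahost.exe.config",   # copy saved before the upgrade
            r"C:\Program Files\Microsoft Integration Runtime\5.0\Shared\diahost.exe.config",
        )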

    1 vote · 0 comments
  5. Real-time data migration via ADF?

    Hi, is it possible to migrate real-time data from SQL Server and Oracle DB to ADLS via ADF, considering that my source data changes every second?

    1 vote · 0 comments
  6. Could we incorporate testing in ADF, please?

    Could we incorporate testing in ADF, please? Something like unit testing, or similar, that a developer can run before publishing the pipeline.
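
    A rough sketch of what such a test could look like today, driven from outside ADF with the management REST API (createRun, then poll the run until it finishes). The subscription, resource group, factory, and pipeline names are hypothetical:

        import time
        import requests
        from azure.identity import DefaultAzureCredential

        BASE = ("https://management.azure.com/subscriptions/<sub-id>"
                "/resourceGroups/my-rg/providers/Microsoft.DataFactory"
                "/factories/my-factory")
        API = {"api-version": "2018-06-01"}

        def test_copy_pipeline_succeeds():
            token = DefaultAzureCredential().get_token(
                "https://management.azure.com/.default").token
            headers = {"Authorization": f"Bearer {token}"}

            # trigger the pipeline and remember its run ID
            run_id = requests.post(f"{BASE}/pipelines/CopyPipeline/createRun",
                                   params=API, headers=headers).json()["runId"]

            # poll until the run reaches a terminal state
            status = "InProgress"
            while status in ("Queued", "InProgress"):
                time.sleep(15)
                status = requests.get(f"{BASE}/pipelineruns/{run_id}",
                                      params=API, headers=headers).json()["status"]

            assert status == "Succeeded"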

    1 vote · 0 comments
  7. Support Excel Binary (xlsb) files in ADF

    Support Excel Binary (xlsb) files in ADF
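
    Until ADF reads .xlsb natively, one workaround sketch: convert the workbook to CSV in a preprocessing step (e.g. an Azure Function or custom activity) and let the pipeline consume the CSV. Assumes pandas with the pyxlsb engine installed; file and sheet names are illustrative:

        import pandas as pd

        # pyxlsb provides the reader for Excel Binary workbooks
        df = pd.read_excel("input.xlsb", sheet_name="Sheet1", engine="pyxlsb")
        df.to_csv("input.csv", index=False)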

    1 vote · 0 comments
  8. Add ability to query pipeline runs filtering on run ID

    It should be possible to query pipeline runs by factory using a list of run IDs. RunId is not currently an accepted operand in RunQueryFilter: https://docs.microsoft.com/en-us/rest/api/datafactory/pipelineruns/querybyfactory#runqueryfilteroperand

    This would allow us to query the details of a known list of run IDs in one call, without having to query them individually. With a list of 100 IDs, it's the difference between 1 call and 100 calls.
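
    A sketch of the proposed call against the Query By Factory endpoint. The RunId operand shown is the suggested addition, not a current API feature; the factory path and run IDs are hypothetical:

        import requests
        from azure.identity import DefaultAzureCredential

        base = ("https://management.azure.com/subscriptions/<sub-id>"
                "/resourceGroups/my-rg/providers/Microsoft.DataFactory"
                "/factories/my-factory")
        token = DefaultAzureCredential().get_token(
            "https://management.azure.com/.default").token

        body = {
            # the time window is still required by the API
            "lastUpdatedAfter": "2021-01-01T00:00:00Z",
            "lastUpdatedBefore": "2021-01-02T00:00:00Z",
            "filters": [{
                "operand": "RunId",       # proposed operand -- not accepted today
                "operator": "In",
                "values": ["<run-id-1>", "<run-id-2>"],
            }],
        }
        runs = requests.post(f"{base}/queryPipelineRuns",
                             params={"api-version": "2018-06-01"},
                             headers={"Authorization": f"Bearer {token}"},
                             json=body).json()["value"]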

    1 vote · 0 comments
  10. "Show parameters" option in Monitor dashboard should only show ONE of each parameter

    The "Show parameters" option in the Monitor page is messed up.
    I have a pipeline with multiple parameters.
    I open the "show parameters" dialogs option and click several to be shown.
    The monitor page shows the parameters.
    Oops! I clicked one too many parameters.
    I open the "show parameters" dialogs option and un-click the parameter that I didn't want.
    Now the monitor shows 2 sets of THE SAME parameters.
    I go into the dialog and reclick the same parameters.
    Now the monitor shows 3 sets of THE SAME parameters.

    1 vote · 0 comments
  10. Skip weekly trigger runs on certain weeks of the month, e.g. the 2nd Tuesday of the month won't run but other Tuesdays will

    The ability to skip a specific date and time for a currently configured trigger, so that the trigger does not fire at that date and time. For example, for a trigger that fires every day, we could mask the schedule so it skips the run on the second Tuesday of every month.
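
    The gating logic the request describes, sketched in plain Python. In ADF today an equivalent check would have to live in an If Condition at the start of the pipeline rather than on the trigger itself:

        from datetime import date

        def is_second_tuesday(d: date) -> bool:
            # the second Tuesday always falls on day 8..14 (weekday 1 = Tuesday)
            return d.weekday() == 1 and 8 <= d.day <= 14

        def should_run(d: date) -> bool:
            return not is_second_tuesday(d)

        assert should_run(date(2021, 6, 1))        # first Tuesday: run
        assert not should_run(date(2021, 6, 8))    # second Tuesday: skip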

    1 vote · 0 comments
  11. Automatically add Content-Length to Web Activity calls

    Since some web APIs require the Content-Length header in the request, and return status code 411 if it isn't supplied, we must add it to the headers ourselves in order to make the request. This can be difficult in Data Factory, especially if the content is dynamically created.

    It would be nice if the Content-Length header (and possibly a few other standard ones) were calculated at runtime and added to the request automatically, similar to how Postman handles its requests. Content-Type (since only JSON is supported) and Host could also be added automatically.
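
    What the requested behavior amounts to, sketched with the standard library: serialize the (possibly dynamic) body, then derive Content-Length from the encoded bytes, which is what clients like Postman do automatically:

        import json

        payload = {"name": "example", "rows": [1, 2, 3]}   # dynamic content
        body = json.dumps(payload).encode("utf-8")

        headers = {
            "Content-Type": "application/json",
            "Content-Length": str(len(body)),   # computed at runtime, not hard-coded
        }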

    1 vote · 0 comments
  12. Allow a choice between "Application server" and "Group" (logon type) when creating a linked service with the "SAP BW via MDX" connector

    Allow a choice between "Application server" and "Group" (logon type) when creating a linked service using the "SAP BW via MDX" connector, similar to the "SAP Table" and "SAP OpenHub" connectors. When creating a linked service with the "SAP BW via MDX" connector, we only have an option for "Server Name", which indicates the application server.
    We had to provide a single server name in order to establish the connection, which limits the load-balancing option of connecting to a group of servers.
    The "SAP Table" and "SAP OpenHub" connectors allow us to choose a "Logon type" during linked service creation, which allows…

    1 vote · 0 comments
  13. Enable option for automatic column mapping to ignore source columns not present in sink

    The automatic column mapping feature in Copy activities is a great idea. It allows you to have extra columns in the destination, but doesn't let you have extra columns in the source. It would be more useful if it allowed extra columns in the source as well (at least as an option). That way, when staging data (particularly on tables with large numbers of columns), you could just put the columns you want to copy in the destination, without needing to modify the source query. If you later want another column, you could just add it to…
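
    The requested mapping rule, sketched generically: copy only the columns that exist in the sink schema and silently ignore extra source columns. The schema and row here are illustrative:

        sink_schema = ["id", "name", "updated_at"]
        source_row = {"id": 7, "name": "widget", "updated_at": "2021-05-01",
                      "internal_flag": True}   # extra source column

        # project the source row onto the sink schema, dropping extras
        mapped = {col: source_row[col] for col in sink_schema if col in source_row}
        # -> {'id': 7, 'name': 'widget', 'updated_at': '2021-05-01'}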

    1 vote · 0 comments
  14. EVERY attribute and setting needs DYNAMIC CONTENT enabled

    I can specify the Stored Procedure name dynamically, but NOT, say, the Pipeline or DataFlow name.

    EVERY property, setting, check-box, dial, switch, lever, and slider needs to support DYNAMIC CONTENT.

    Look at the list of feedback items asking for DYNAMIC CONTENT on one attribute or another.

    1 vote · 0 comments
  15. Support keyboard usage in the mapping data flow editor

    In the editor for mapping data flow scripts, the keyboard mapping is only partial and not practical, because it is unintuitive:
    - There is no key to save the changes,
    - but the ESC key leaves the editor without saving. This even happens if we have entered a search string and only want to leave the search.
    Solution:
    - When pressing the Escape key from the search widget, do not leave the editor, only the search.
    - When pressing the Escape key otherwise, and the script has been changed, ask whether the changes should be saved (the choices should…

    1 vote · 0 comments
  16. Re-submit failed items in a ForEach loop without rerunning the entire activity

    A ForEach loop is very commonly used to copy data in parallel; a single loop can copy hundreds of tables at a time. But if a few tables fail to copy, you need the ability to re-run and copy only the failures, not the entire ForEach loop.
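
    The retry pattern the request asks for, sketched generically: run the loop once, collect the items that failed, and re-submit only those. copy_table is a hypothetical stand-in for whatever the ForEach body does:

        def copy_table(name: str) -> None:
            ...  # hypothetical per-item work; raises on failure

        def run_batch(tables: list[str]) -> list[str]:
            failed = []
            for t in tables:
                try:
                    copy_table(t)
                except Exception:
                    failed.append(t)   # remember only the failures
            return failed

        failed = run_batch(["customers", "orders", "invoices"])
        if failed:
            failed = run_batch(failed)   # re-run just the failed items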

    1 vote · 0 comments
  19. Need "Error Handler" section in ADF just like we had in SSIS

    We can have an "Error Handling" section in ADF - it can allow us to configure "Throw error" and "Email", "Log Error" activities (3 new activities that ADF should have) which will allow us to re-use standard error handler templates that we create in this section across the Data Factory - this section can have generic code snippets which can be event driven so that we don't have to rely on
    1. Logic Apps to send out the emails
    2. One pipeline can only have one failure activity which is sitting out of the pipeline (instead of one connected to…

    1 vote · 1 comment
  18. Ignore quoted line breaks like Power BI

    In Power BI, when loading CSV files there is an option to "Ignore quoted line breaks", which overcomes the issue of having line breaks in the middle of free-text data that cause column misalignment. This would really help when dealing with data from other systems. Thanks.
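
    What handling quoted line breaks means in practice, sketched with Python's standard csv module, which keeps a quoted embedded newline inside one field instead of starting a new record:

        import csv, io

        raw = 'id,comment\n1,"first line\nsecond line"\n2,plain\n'
        rows = list(csv.reader(io.StringIO(raw)))
        # -> [['id', 'comment'], ['1', 'first line\nsecond line'], ['2', 'plain']]
        assert len(rows) == 3   # 3 records, not 4: the quoted newline stayed inline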

    1 vote · 0 comments