Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will leverage first-party solutions for customer feedback.

Data Factory

Azure Data Factory allows you to manage the production of trusted information by offering an easy way to create, orchestrate, and monitor data pipelines over the Hadoop ecosystem using structured, semi-structured, and unstructured data sources. You can connect to your on-premises SQL Server, Azure databases, tables, or blobs and create data pipelines that will process the data with Hive and Pig scripting, or custom C# processing. The service offers a holistic monitoring and management experience over these pipelines, including a view of their data production and data lineage down to the source systems. The outcome of Data Factory is the transformation of raw data assets into trusted information that can be shared broadly with BI and analytics tools.

Do you have an idea, suggestion or feedback based on your experience with Azure Data Factory? We’d love to hear your thoughts.

  1. Support child pipelines with different frequencies in a master pipeline

    In my project I have one or two jobs that run on a weekly basis, while all other jobs run daily.

    The final job depends on both the daily and weekly jobs, so we have to manage trigger times to accommodate both schedules and provide buffers for the jobs to complete. It would be better if there were a facility to combine pipelines of different frequencies in order to save time.
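
    One workaround that seems feasible today (not the requested feature itself): run a single daily master pipeline and guard the weekly child behind an If Condition on the day of the week. Below is a minimal sketch of that fragment, written as a Python dict mirroring the pipeline JSON; the child pipeline names DailyChild and WeeklyChild and the Monday check are placeholders.

      import json

      # Sketch: the daily child always runs; the weekly child only runs when
      # dayOfWeek(utcnow()) == 1 (Monday). All names are placeholders.
      master_activities = [
          {
              "name": "RunDailyChild",
              "type": "ExecutePipeline",
              "typeProperties": {
                  "pipeline": {"referenceName": "DailyChild", "type": "PipelineReference"},
                  "waitOnCompletion": True,
              },
          },
          {
              "name": "MaybeRunWeeklyChild",
              "type": "IfCondition",
              "dependsOn": [{"activity": "RunDailyChild", "dependencyConditions": ["Succeeded"]}],
              "typeProperties": {
                  "expression": {"value": "@equals(dayOfWeek(utcnow()), 1)", "type": "Expression"},
                  "ifTrueActivities": [
                      {
                          "name": "RunWeeklyChild",
                          "type": "ExecutePipeline",
                          "typeProperties": {
                              "pipeline": {"referenceName": "WeeklyChild", "type": "PipelineReference"},
                              "waitOnCompletion": True,
                          },
                      }
                  ],
              },
          },
      ]

      print(json.dumps(master_activities, indent=2))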

    1 vote · 0 comments
  2. Include the run ID of the started pipeline in Get-AzDataFactoryV2ActivityRun output when an activity executes a pipeline

    When an activity executes a pipeline, the activity run information returned by PowerShell should include the run ID of the pipeline it attempted to start.
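
    For comparison, the REST API's query-activity-runs operation returns an output object per activity, and for an Execute Pipeline activity that output appears to include the child pipeline's run ID. A rough Python sketch of reading it that way; the subscription, resource group, factory, run ID, and token are placeholders.

      import requests

      # Placeholders -- substitute real values.
      SUB, RG, FACTORY = "<subscription-id>", "<resource-group>", "<factory-name>"
      RUN_ID = "<parent-pipeline-run-id>"
      TOKEN = "<bearer-token-for-management.azure.com>"

      url = (
          f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
          f"/providers/Microsoft.DataFactory/factories/{FACTORY}"
          f"/pipelineruns/{RUN_ID}/queryActivityruns?api-version=2018-06-01"
      )
      body = {"lastUpdatedAfter": "2021-01-01T00:00:00Z", "lastUpdatedBefore": "2021-12-31T00:00:00Z"}

      resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
      resp.raise_for_status()

      for run in resp.json().get("value", []):
          if run.get("activityType") == "ExecutePipeline":
              # The child pipeline's run id is surfaced in the activity output.
              print(run["activityName"], "->", (run.get("output") or {}).get("pipelineRunId"))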

    1 vote · 0 comments
  3. Mapping Data Flows Dynamic Sink Folder

    In the configuration of the sink in Data Flows, please allow the sink folder to be configured in the data flow itself and not as part of the dataset configuration.
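
    Until the sink folder can be set inside the data flow itself, a common workaround is a parameterized dataset whose folder path is supplied by the calling Execute Data Flow activity at run time. A minimal sketch of such a dataset, written as a Python dict mirroring the dataset JSON; the linked service, container, and parameter names are placeholders.

      import json

      # Parameterized DelimitedText dataset on ADLS Gen2: the sink folder is not
      # hard-coded; it is passed in as the dataset parameter "sinkFolder".
      sink_dataset = {
          "name": "SinkDelimitedText",
          "properties": {
              "type": "DelimitedText",
              "linkedServiceName": {"referenceName": "AdlsGen2Ls", "type": "LinkedServiceReference"},
              "parameters": {"sinkFolder": {"type": "String"}},
              "typeProperties": {
                  "location": {
                      "type": "AzureBlobFSLocation",
                      "fileSystem": "output",  # placeholder container
                      "folderPath": {"value": "@dataset().sinkFolder", "type": "Expression"},
                  }
              },
          },
      }

      print(json.dumps(sink_dataset, indent=2))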

    1 vote · 0 comments
  4. Allow automatic approval of Managed Private Endpoints

    Creating Private Endpoints through Azure Data Factory requires that the endpoint is manually approved. It would be great if endpoints could be automatically approved using RBAC, the same way it works when creating an endpoint through the portal.

    1 vote · 0 comments
  5. Auto-align Functions differently

    When working with 10+ Functions, it's not really fun horizontally scrolling through them. I wish the auto-align would place my Azure Functions in a way that made more sense for viewing lots of them.

    1 vote · 0 comments
  6. Monitor

    Under Monitor, please provide an option to "Mark as Success" for a failed activity, alongside "Rerun".

    1 vote · 0 comments
  7. Dynamic SFTP LinkedService

    Make the SFTP linked service accept dynamic connection details (and every other linked service as well).
    Just like the SQL Server and Oracle linked services, if the SFTP linked service also allowed dynamic values, this would eliminate the need to create a separate linked service for each SFTP server, along with the extra datasets and pipelines.
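
    Purely for illustration, here is a hypothetical sketch of what a parameterized Sftp linked service might look like if it followed the same pattern as the parameterized SQL Server linked service (host, port, and user supplied per run, password from Key Vault). This is the requested capability sketched out, not documented behavior; all names are placeholders.

      import json

      # Hypothetical parameterized SFTP linked service (illustration of the request).
      sftp_ls = {
          "name": "DynamicSftp",
          "properties": {
              "type": "Sftp",
              "parameters": {
                  "host": {"type": "String"},
                  "port": {"type": "Int", "defaultValue": 22},
                  "userName": {"type": "String"},
              },
              "typeProperties": {
                  "host": "@{linkedService().host}",
                  "port": "@{linkedService().port}",
                  "authenticationType": "Basic",
                  "userName": "@{linkedService().userName}",
                  "password": {
                      "type": "AzureKeyVaultSecret",
                      "store": {"referenceName": "KeyVaultLs", "type": "LinkedServiceReference"},
                      "secretName": "sftp-password",  # placeholder secret name
                  },
              },
          },
      }

      print(json.dumps(sftp_ls, indent=2))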

    1 vote · 0 comments
  8. Annotations on design canvas

    I would like to attach notes to a pipeline step or a data flow step that are visible on the canvas. Maybe an icon on the box in the pipeline that pops out visible text, or a frame around the box with the text below or above the existing box.

    I have attached a terrible MS Paint mock up of what the two different approaches might look like.

    1 vote · 0 comments
  9. User Properties with dynamic content

    User Properties for an Activity would be very useful if dynamic content is supported. Currently with static content, they are no more useful than the activity name. With dynamic content it would be VERY useful to help with diagnostics.
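
    To make the ask concrete, here is a hypothetical activity fragment where the user property value is dynamic content rather than a literal; this is the requested behavior sketched as a Python dict, and the property and parameter names are made up.

      import json

      # Hypothetical: a user property whose value is dynamic content, so the Monitor
      # view would show the actual file processed by this run, not a fixed string.
      copy_activity = {
          "name": "CopyDailyFile",
          "type": "Copy",
          "userProperties": [
              {
                  "name": "SourceFile",
                  "value": {"value": "@pipeline().parameters.sourceFileName", "type": "Expression"},
              }
          ],
          # ...source/sink typeProperties omitted...
      }

      print(json.dumps(copy_activity, indent=2))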

    1 vote · 0 comments
  10. Allow configuring concurrency across multiple pipelines

    It's great that we can limit concurrency on an individual pipeline, but it would be really useful if there were a way to prevent specific sets of pipelines from executing concurrently, without having to put them all into some form of master pipeline.
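
    For reference, the master-pipeline workaround mentioned above looks roughly like this: a wrapper pipeline with concurrency set to 1 that chains the pipelines which must not overlap (pipeline names are placeholders). The request is to get the same guarantee without such a wrapper.

      import json

      # Wrapper pipeline: concurrency 1 means only one instance of the wrapper runs
      # at a time, and the two children are chained so they never overlap.
      wrapper = {
          "name": "NoOverlapWrapper",
          "properties": {
              "concurrency": 1,
              "activities": [
                  {
                      "name": "RunPipelineA",
                      "type": "ExecutePipeline",
                      "typeProperties": {
                          "pipeline": {"referenceName": "PipelineA", "type": "PipelineReference"},
                          "waitOnCompletion": True,
                      },
                  },
                  {
                      "name": "RunPipelineB",
                      "type": "ExecutePipeline",
                      "dependsOn": [{"activity": "RunPipelineA", "dependencyConditions": ["Succeeded"]}],
                      "typeProperties": {
                          "pipeline": {"referenceName": "PipelineB", "type": "PipelineReference"},
                          "waitOnCompletion": True,
                      },
                  },
              ],
          },
      }

      print(json.dumps(wrapper, indent=2))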

    1 vote · 1 comment
  11. Ability to move files and maintain last modified datetime

    There really needs to be a way to move files, at least in local file systems (via self-hosted runtimes) and in Azure BLOB storage. Copy is far too slow, particularly when you're moving between on-premises folders. Copying the data to the cloud and back down again just doesn't make sense. But more importantly, when you move, there needs to be a way to maintain the last modified date. The copy/delete option that we have now means that's always lost.
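
    For context on why the timestamp is lost: any copy-then-delete "move" creates a new blob, and its Last-Modified is set at copy time. A minimal sketch of that pattern with the azure-storage-blob SDK, assuming placeholder connection details; a SAS on the source URL may be needed depending on how access is set up.

      from azure.storage.blob import BlobServiceClient

      # Placeholder connection details.
      svc = BlobServiceClient.from_connection_string("<storage-connection-string>")
      src = svc.get_blob_client("data", "incoming/file.csv")
      dst = svc.get_blob_client("data", "archive/file.csv")

      # "Move" = copy + delete. The destination is a brand-new blob, so its
      # Last-Modified reflects the copy time, not the original file's timestamp.
      dst.start_copy_from_url(src.url)  # may require a SAS token appended to src.url
      print("new Last-Modified:", dst.get_blob_properties().last_modified)
      # Real code should poll dst.get_blob_properties().copy.status before deleting.
      src.delete_blob()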

    1 vote · 0 comments
  12. Ability to fetch keys from app-config in data factory pipeline activities

    During my project work, I came across a situation where I needed to get some config values from App Configuration in my pipeline activities. I know that we can fetch keys from key vaults in pipeline activities, but we can't do the same with App Configuration. Data Factory should allow connectivity to App Configuration so that config values can easily be pulled into pipeline activities. Config values can be related to different environments, which would be helpful in the CI/CD process of ADF.
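
    As a stop-gap, the lookup can be done outside ADF today, for example in an Azure Function invoked from a pipeline activity. A minimal sketch of that lookup with the azure-appconfiguration SDK; the connection string, key, and label are placeholders.

      from azure.appconfiguration import AzureAppConfigurationClient

      # Placeholder connection string / key / label.
      client = AzureAppConfigurationClient.from_connection_string("<app-config-connection-string>")
      setting = client.get_configuration_setting(key="MyPipeline:BatchSize", label="prod")
      print(setting.key, "=", setting.value)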

    1 vote · 0 comments
  13. Activity Name can be passed to Next Activity

    As of now there is no provision to know the name of the activity that is triggering the next activity. It would be great to have the activity name passed from one activity to the next, just as we can access the pipeline name and data factory name.

    1 vote · 0 comments
  14. I suggest that Data Factory have an option to execute specific activities, like SQL Server Integration Services (SSIS) does

    I mean, being able to disable some activities and run only the specific activities the user needs in Azure Data Factory.

    I think this option would help everybody in their projects.

    1 vote · 0 comments
  15. Do not overwrite customer proxy configurations when upgrading ADF/SHIR

    This is a source of production and test problems.

    When a major upgrade occurs, Microsoft should migrate the existing configuration settings in diahost.exe.config and diawp.exe.config from the earlier installation (the system-level proxy entries), or at the least provide an option to migrate configurations during the process. It's highly unlikely that proxies change day to day.

    What can happen is that an upgrade occurs, everything stops working, and it takes a while to address and investigate why; it can be difficult to determine whether it's an upgrade, pipeline, or network issue.

    By migrating the configuration, this will save customers and…
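
    Until configuration migration is built in, a pre-upgrade snapshot of the two config files at least makes recovery quick. A minimal sketch; the install path is an assumption and should be adjusted to the actual SHIR installation.

      from pathlib import Path
      import datetime
      import shutil

      # Assumed default SHIR install path -- adjust to your installation.
      install_root = Path(r"C:\Program Files\Microsoft Integration Runtime")
      backup_dir = Path(r"C:\SHIR-config-backup") / datetime.date.today().isoformat()
      backup_dir.mkdir(parents=True, exist_ok=True)

      # Picks up diahost.exe.config and diawp.exe.config wherever they live.
      for cfg in install_root.rglob("dia*.exe.config"):
          shutil.copy2(cfg, backup_dir / cfg.name)  # copy2 keeps timestamps
          print(f"backed up {cfg}")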

    1 vote · 0 comments
  16. Support for JSON path in expressions

    Currently you have "xpath" to support XML path expressions, but there is no JSON path expression parser. It would be helpful to have one.
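
    To illustrate the semantics being requested, here is roughly what a JSON path lookup does, shown with the third-party jsonpath-ng package in Python. This only illustrates the requested capability; it is not an existing ADF expression function.

      from jsonpath_ng import parse  # pip install jsonpath-ng

      payload = {"orders": [{"id": 1, "total": 9.5}, {"id": 2, "total": 12.0}]}

      # The kind of one-line extraction a jsonPath()-style expression function could offer.
      ids = [match.value for match in parse("$.orders[*].id").find(payload)]
      print(ids)  # [1, 2]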

    1 vote · 0 comments
  17. Real-time data migration via ADF?

    Hi, is it possible to migrate real-time data from SQL Server and Oracle DB to ADLS via ADF, considering my source data changes every second?

    1 vote · 0 comments
  18. Could we incorporate testing in ADF, please?

    Could we incorporate testing in ADF, please? Something like unit testing, or similar, that a developer can run before publishing the pipeline.
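
    Until first-class testing exists, teams typically script it themselves: trigger a pipeline run through the REST API and assert on the final status. A rough pytest-style sketch; the subscription, resource group, factory, pipeline name (CopyPipeline), and token are placeholders.

      import time
      import requests

      SUB, RG, FACTORY = "<subscription-id>", "<resource-group>", "<factory-name>"
      TOKEN = "<bearer-token-for-management.azure.com>"
      BASE = (
          f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
          f"/providers/Microsoft.DataFactory/factories/{FACTORY}"
      )
      HEADERS = {"Authorization": f"Bearer {TOKEN}"}

      def test_copy_pipeline_succeeds():
          # Trigger the pipeline and grab the run id.
          run = requests.post(
              f"{BASE}/pipelines/CopyPipeline/createRun?api-version=2018-06-01", headers=HEADERS
          ).json()
          run_id = run["runId"]

          # Poll until the run reaches a terminal state, then assert on it.
          while True:
              status = requests.get(
                  f"{BASE}/pipelineruns/{run_id}?api-version=2018-06-01", headers=HEADERS
              ).json()["status"]
              if status not in ("Queued", "InProgress"):
                  break
              time.sleep(15)

          assert status == "Succeeded"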

    1 vote · 0 comments