Data Factory

Azure Data Factory lets you manage the production of trusted information by offering an easy way to create, orchestrate, and monitor data pipelines over the Hadoop ecosystem, using structured, semi-structured, and unstructured data sources. You can connect to your on-premises SQL Server, Azure databases, tables, or blobs and create data pipelines that process the data with Hive and Pig scripting, or custom C# processing. The service offers a holistic monitoring and management experience over these pipelines, including a view of their data production and data lineage down to the source systems. The outcome of Data Factory is the transformation of raw data assets into trusted information that can be shared broadly with BI and analytics tools.
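
A pipeline is defined as a JSON document listing its activities. A minimal sketch of that shape with a single Hive activity (the names and script path are hypothetical, and a real definition also needs linked service references):

    {
      "name": "ExamplePipeline",
      "properties": {
        "activities": [
          {
            "name": "TransformWithHive",
            "type": "HDInsightHive",
            "typeProperties": {
              "scriptPath": "scripts/transform.hql"
            }
          }
        ]
      }
    }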

Do you have an idea, suggestion or feedback based on your experience with Azure Data Factory? We’d love to hear your thoughts.

  1. Save specific column mappings

    If you are extracting data using the MDX for SAP BW table type and need to add another column, today you have to redo the entire mapping.
    This is especially annoying with MDX because it doesn't support aliases, so the automatic mapping built into ADF almost never works.
    With a 19-column mapping, the current behavior means mapping all 20 columns whenever a single new column is added.
    If one could save/bind a mapping between two columns, so that the binding survives an 'Import Schemas', that would be a huge boon to development speed.
    See the attachment for where…
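
    For context, the Copy activity's translator section can carry an explicit source-to-sink binding; the idea is essentially asking for such mappings to survive a schema re-import. A sketch with hypothetical column names:

        "translator": {
          "type": "TabularTranslator",
          "mappings": [
            { "source": { "name": "[Measures].[0AMOUNT]" }, "sink": { "name": "Amount" } },
            { "source": { "name": "[0COSTCENTER].[LEVEL01]" }, "sink": { "name": "CostCenter" } }
          ]
        }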

    1 vote · 0 comments
  2. How to run a parent pipeline until some condition is satisfied

    My source data is stored in blob storage in a date-based folder hierarchy, so each folder is named for a date, e.g. '2020-01-01'. I process the data in Azure Databricks. My problem is that if Databricks fails for a few days, then after fixing the issue I need to run the pipeline and fetch the data from the dates on which Databricks failed. What I am doing is writing a custom SQL query in a Lookup activity and passing its output, which is a date, to Databricks as a parameter. Databricks reads this date and identifies…
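
    A sketch of the pattern described, assuming a hypothetical watermark table and notebook path (linked service references omitted for brevity):

        {
          "name": "GetResumeDate",
          "type": "Lookup",
          "typeProperties": {
            "source": {
              "type": "AzureSqlSource",
              "sqlReaderQuery": "SELECT MAX(ProcessedDate) AS LastGoodDate FROM dbo.WatermarkLog"
            },
            "dataset": { "referenceName": "WatermarkDataset", "type": "DatasetReference" }
          }
        },
        {
          "name": "ProcessInDatabricks",
          "type": "DatabricksNotebook",
          "dependsOn": [ { "activity": "GetResumeDate", "dependencyConditions": [ "Succeeded" ] } ],
          "typeProperties": {
            "notebookPath": "/jobs/ProcessDailyFolder",
            "baseParameters": {
              "startDate": "@activity('GetResumeDate').output.firstRow.LastGoodDate"
            }
          }
        }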

    1 vote · 0 comments
  3. Please add a function to import an ARM template into the current Data Factory

    Today, when I do [Import ARM Template], it can only deploy a new Data Factory.
    Please add a function that can import an ARM template into the current Data Factory.
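
    If it helps in the meantime: deploying the exported template as an incremental-mode resource-group deployment does update child resources of an existing factory in place. A minimal sketch (the pipeline name is hypothetical):

        {
          "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
          "contentVersion": "1.0.0.0",
          "parameters": { "factoryName": { "type": "string" } },
          "resources": [
            {
              "type": "Microsoft.DataFactory/factories/pipelines",
              "apiVersion": "2018-06-01",
              "name": "[concat(parameters('factoryName'), '/MyPipeline')]",
              "properties": { "activities": [] }
            }
          ]
        }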

    1 vote · 1 comment
  4. Enable the option to sort Task History

    When a task is triggered via ADF, it gets a system-generated ID which we cannot control. The current task history defaults to sorting by task name. There should be a better way of sorting task history, i.e., sorting by run date.

    1 vote · 0 comments
  5. Generate ADF logging WHEN it happens

    There is NO information on HOW ADF writes the activity, pipeline, and trigger logs! I have run multiple jobs and the data is never consistent. Sometimes it writes that the activity has started and is in progress, but will NOT log the error. Sometimes it just does not write data at ALL (unless it only logs data every x minutes, which is no good if it is not near real time). I just wish I knew:
    1) how it writes the logs, and
    2) that there were common messages for ALL executions (e.g. started, in progress, failure, success), not sporadic ones.
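
    Until the logging behavior is documented, routing the run logs to a Log Analytics workspace at least makes them queryable. A sketch of such a diagnostic setting (the workspace ID is a placeholder):

        {
          "type": "Microsoft.Insights/diagnosticSettings",
          "apiVersion": "2021-05-01-preview",
          "name": "adf-runs-to-log-analytics",
          "properties": {
            "workspaceId": "<log-analytics-workspace-resource-id>",
            "logs": [
              { "category": "PipelineRuns", "enabled": true },
              { "category": "ActivityRuns", "enabled": true },
              { "category": "TriggerRuns", "enabled": true }
            ]
          }
        }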

    1 vote · 0 comments
  6. Please put links to navigate between the activity runs of different pipeline runs

    When we click the "Monitor" option for any specific pipeline, we get the list of all of its previous runs during a certain period.
    Then, if we click the "View activity runs" option for a specific run, we get the detailed step-by-step run and the corresponding result, whether success or failure.
    On the page that appears after clicking "View activity runs", please put links to go to the activity runs of the next or previous pipeline runs (something like "first", "previous", "next", and "last" links).
    It is difficult to go back to the pipelines page and then click on "view activity…

    1 vote · 0 comments
  7. Email notifications when a trigger or pipeline fails or succeeds

    The current email notification system is based on alerts, so you have to set the min/average/max/total occurrences of an event over a period. This means that activation and resolution emails are generated, when all I want is one simple email stating whether a trigger succeeded or failed.
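
    A common workaround today is a Web activity wired to a Failed dependency that posts the run details to a Logic App (or any HTTP endpoint) which sends the mail. A sketch, with hypothetical activity names and a placeholder URL:

        {
          "name": "EmailOnFailure",
          "type": "WebActivity",
          "dependsOn": [ { "activity": "CopyData", "dependencyConditions": [ "Failed" ] } ],
          "typeProperties": {
            "url": "https://<your-logic-app-http-endpoint>",
            "method": "POST",
            "body": {
              "pipeline": "@{pipeline().Pipeline}",
              "runId": "@{pipeline().RunId}",
              "status": "Failed"
            }
          }
        }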

    1 vote · 0 comments
  8. Allow ADF to write to ADLA Database

    Currently ADF can write to Azure Data Lake Storage as, for example, a CSV file.

    Please create functionality so that ADF can write directly to an Azure Data Lake Analytics (ADLA) database. For example, if I am reading from a SQL table, I could write directly to an ADLA database without first dropping the data as a CSV.

    1 vote · 0 comments
  9. Let it be more flexible to changes in tables, pipelines, etc. Also, let us download all JSON files easily

    1 vote · 0 comments
  10. IDE Needed for Complete Azure Portal

    A single IDE should be developed that has all the detailed functionality of the Azure portal. Visual Studio is OK, but we need seamless working from the desktop; the browser has its own limitations. It should be fast and stable, and a developer should be able to do all regular development and maintenance of objects within it, without going to the portal.

    1 vote · 0 comments
  11. Support Loading of ContentNote for Salesforce

    The Copy activity doesn't support loading ContentNote for Salesforce; there should be a mapping to a physical file location. We had to build a custom C# app to do this.

    https://help.salesforce.com/articleView?id=000339316&type=1&mode=1

    1 vote · 0 comments
  12. (idea text missing)

    1 vote · 1 comment
  13. Add Event Handling Workflows

    If an activity in a pipeline fails, it should trigger a separate workflow through which custom logging can be achieved. This can be done today with on-error wiring (as sketched below), but that is not optimal, because it makes the pipeline huge.
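
    The per-activity on-error wiring being described, factored so all the custom logging lives in one reusable pipeline, looks roughly like this (names are hypothetical):

        {
          "name": "LogFailure",
          "type": "ExecutePipeline",
          "dependsOn": [ { "activity": "MainActivity", "dependencyConditions": [ "Failed" ] } ],
          "typeProperties": {
            "pipeline": { "referenceName": "CustomLoggingPipeline", "type": "PipelineReference" },
            "parameters": { "runId": "@{pipeline().RunId}" }
          }
        }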

    1 vote · 0 comments
  14. Router Activity

    An activity that routes to different links based on the condition provided in each link (dynamic links). Failing that, for now, just two links (true or false for an event/condition), not (success or failure).
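
    ADF's Switch activity covers part of this request: it routes to one of several case branches based on a single expression. A sketch (the variable and case values are hypothetical):

        {
          "name": "RouteByCondition",
          "type": "Switch",
          "typeProperties": {
            "on": { "value": "@variables('routeKey')", "type": "Expression" },
            "cases": [
              { "value": "daily", "activities": [] },
              { "value": "weekly", "activities": [] }
            ],
            "defaultActivities": []
          }
        }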

    1 vote · 0 comments
  15. Intellisense support for dynamic content

    Right now there is no IntelliSense support for dynamic content. It would be a great addition to have.

    1 vote · 0 comments
  16. (idea text missing)

    1 vote · 0 comments
  17. Include limitations in the documentation, or at least a link

    I could not find the limitations of ADF V2 in its documentation; I found them on GitHub and in the ExpressRoute documentation. It would be very useful for the ADF documentation to have a section with at least a link to the limitations.

    1 vote · 0 comments
  18. Currently ADF isn't real-time

    Currently ADF isn't real-time. Can you please help make the sync real-time, so that if someone makes a change in one database it appears in the other?

    1 vote · 0 comments
  19. Missing standalone sink activity

    In the scenario where you need to update a file (JSON) with a "Last Run Date" from a variable (without SQL), a standalone "sink" activity is required. Is this going to be available at some point, and what would be a workaround?
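
    One workaround sketch: a Web activity that PUTs the JSON state file straight to Blob storage, assuming the factory's managed identity has write access to the storage account (the account, container, and file names are placeholders):

        {
          "name": "WriteLastRunDate",
          "type": "WebActivity",
          "typeProperties": {
            "url": "https://<account>.blob.core.windows.net/state/lastrun.json",
            "method": "PUT",
            "headers": { "x-ms-blob-type": "BlockBlob", "x-ms-version": "2020-04-08" },
            "body": { "lastRunDate": "@{variables('LastRunDate')}" },
            "authentication": { "type": "MSI", "resource": "https://storage.azure.com/" }
          }
        }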

    1 vote · 0 comments
  20. Possible incompatibility between latest SSDT 2017 and SSIS Integration Runtime in Data Factory

    I downloaded the latest version of SQL Server Data Tools 2017, which has the component Microsoft SQL Server Integration Services Designer 15.0.X.Y.

    All my deployments to the SSIS Integration Runtime (hosted using Azure SQL Server) failed when using a Script Task, saying "Cannot load script for execution".

    I first thought it was a problem with my laptop, but then I benchmarked with one of my peers: he downloaded the same SSDT version and his packages got the same error.

    Then I downloaded the earliest available version of SSDT 2017 (15.6.X), which includes the component Microsoft SQL Server Integration Services Designer 14.0.X.Y.

    Packages worked fine…

    1 vote · 1 comment