Data Factory

Azure Data Factory allows you to manage the production of trusted information by offering an easy way to create, orchestrate, and monitor data pipelines over the Hadoop ecosystem using structured, semi-structured, and unstructured data sources. You can connect to your on-premises SQL Server, Azure databases, tables, or blobs and create data pipelines that process the data with Hive and Pig scripting, or custom C# processing. The service offers a holistic monitoring and management experience over these pipelines, including a view of their data production and data lineage down to the source systems. The outcome of Data Factory is the transformation of raw data assets into trusted information that can be shared broadly with BI and analytics tools.
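
A minimal pipeline definition, for illustration only, might look like the sketch below. The pipeline, activity, and dataset names (CopySqlToBlobPipeline, SqlServerSourceDataset, BlobSinkDataset) are hypothetical placeholders, and the exact JSON shape depends on the Data Factory version you target.

    {
        "name": "CopySqlToBlobPipeline",
        "properties": {
            "description": "Illustrative sketch: copy rows from an on-premises SQL Server table to Azure Blob storage.",
            "activities": [
                {
                    "name": "CopySqlToBlob",
                    "type": "Copy",
                    "inputs": [ { "referenceName": "SqlServerSourceDataset", "type": "DatasetReference" } ],
                    "outputs": [ { "referenceName": "BlobSinkDataset", "type": "DatasetReference" } ],
                    "typeProperties": {
                        "source": { "type": "SqlServerSource" },
                        "sink": { "type": "BlobSink" }
                    }
                }
            ]
        }
    }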

Do you have an idea, suggestion or feedback based on your experience with Azure Data Factory? We’d love to hear your thoughts.

  1. Data Factory activity output deciding the "color" of the entire pipeline status

    In an Azure Data Factory pipeline, a run can have only two statuses: OK (green) or Failed (red).

    Would it be possible to have a third status, such as Partially Succeeded (orange), based on some custom activity outputs?

    The idea comes from this discussion: https://github.com/MicrosoftDocs/azure-docs/issues/31281

    1 vote  ·  0 comments
  2. Make a wizard similar to the new Copy (Preview) that just allows you to type any long-running SQL statement

    We need a way of just entering an ad hoc SQL statement that we know will run for hours or days and having it managed by ADF.
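
    Until something like that exists, one possible workaround (a hedged sketch only, not an officially documented pattern) is a Stored Procedure activity that wraps the ad hoc statement in sp_executesql with a long activity timeout; the activity, linked service, and statement names below (RunAdhocSql, AzureSqlLinkedService, dbo.LongRunningRebuild) are made up for illustration.

    {
        "name": "RunAdhocSql",
        "type": "SqlServerStoredProcedure",
        "linkedServiceName": { "referenceName": "AzureSqlLinkedService", "type": "LinkedServiceReference" },
        "policy": { "timeout": "3.00:00:00", "retry": 0 },
        "typeProperties": {
            "storedProcedureName": "sp_executesql",
            "storedProcedureParameters": {
                "stmt": { "value": "EXEC dbo.LongRunningRebuild", "type": "String" }
            }
        }
    }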

    1 vote  ·  0 comments
  3. Azure Table Storage key field translations, JSON element naming, and rollover text

    I think it was a mistake to have a separate system for setting PartitionKey and RowKey values in the sink section, when you had a system set up in the translator area specifically for the purpose of translating fields (columns) when reading and writing entities. When I first tried to get a PartitionKey to come from another field in my source table (NewPKey in this example), I incorrectly tried to do it like this ...

    "translator": {
    "type": "TabularTranslator",
    "columnMappings": "NewPKey: PartitionKey, RowKey: RowKey, Value: Value"
    }
    ... of course, that didn't work. For logical consistency and simplicity, I wish that…
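
    For reference, the approach that does seem to work today is to set the key columns on the sink itself, roughly as in the hedged sketch below (column names taken from the example above; azureTableInsertType is shown only as an example value).

    "sink": {
        "type": "AzureTableSink",
        "azureTablePartitionKeyName": "NewPKey",
        "azureTableRowKeyName": "RowKey",
        "azureTableInsertType": "merge"
    }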

    1 vote  ·  0 comments
  4. Unable to connect Azure SQL Managed Instance to ADF through the AutoResolve IR using a private endpoint

    In ADF data flows, we can only use sources that connect through the AutoResolve integration runtime.

    But we are unable to connect to Azure SQL Managed Instance through the AutoResolve IR using a private endpoint.

    We can't use a public endpoint, because the data would be exposed to the outside network, and for security reasons.

    Also, your new Managed Private Endpoints feature in ADF has not launched yet.

    Please suggest a solution for connecting Azure SQL to ADF using a private endpoint through the default IR.

    1 vote  ·  0 comments
  5. PowerShell command or API to publish pipelines

    I couldn't find any method to publish pipeline changes to a collaboration branch other than the Azure portal.

    If such a method exists, could you please point us to the documentation? If not, please consider this a request and implement it to support automation.

    1 vote  ·  0 comments
  6. Support the ability to pass an array to an IN clause inside a copy activity

    It would be very helpful if it were possible to take the array results of a Lookup activity and use them inside an IN clause of a Copy activity query. Right now the array returned by the Lookup activity cannot be used inside the query, since it is returned in JSON format.
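
    As an interim workaround (a hedged sketch, not an official pattern), the lookup query can collapse the values into a single comma-separated string, e.g. SELECT STRING_AGG(CAST(CustomerId AS varchar(20)), ',') AS IdList FROM dbo.TargetCustomers, so the Copy source only has to reference firstRow. The activity and column names below (LookupIds, IdList, dbo.Orders) are invented for illustration.

    "source": {
        "type": "AzureSqlSource",
        "sqlReaderQuery": {
            "value": "SELECT * FROM dbo.Orders WHERE CustomerId IN (@{activity('LookupIds').output.firstRow.IdList})",
            "type": "Expression"
        }
    }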

    1 vote  ·  0 comments
  7. Gantt chart mixes up seconds and minutes

    There is an error in the activity run Gantt chart. It looks like it's confusing seconds with minutes.

    For example:
    Start Date 12-9-2019 14:18:28
    End Date 12-9-2019 14:18:57

    This is shown in the Gantt chart as running from minute 28 until minute 57, while these values are actually seconds.

    1 vote  ·  0 comments
  8. We need to choose a table even when we don't want a table

    This seems to be a UI bug:

    When using a database data source, such as Synapse Analytics or other databases, we need to create the linked service and the dataset, and we need to specify a table on the dataset. Only after this can we, on the pipeline activity, specify that we would like to use a query instead of the table.

    The table on the dataset has no use, so I end up choosing an arbitrary table unrelated to the Data Factory work.

    The UI should allow us to create a dataset based on a query, not only specify the…
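
    As a JSON-level workaround in the meantime (a hedged sketch rather than a supported recipe), a dataset can be authored with empty typeProperties and the query supplied on the Copy source instead; the names below (GenericAzureSqlDataset, AzureSqlLinkedService) are placeholders.

    {
        "name": "GenericAzureSqlDataset",
        "properties": {
            "type": "AzureSqlTable",
            "linkedServiceName": { "referenceName": "AzureSqlLinkedService", "type": "LinkedServiceReference" },
            "schema": [],
            "typeProperties": {}
        }
    }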

    1 vote  ·  0 comments
  9. Improve Mobile device support

    Fix the ADFv2 site so that it works on a mobile device. It is not possible to easily restart an activity from a small phone screen and it is impossible to navigate the pipeline map.

    1 vote  ·  0 comments
  10. Power BI

    Ability to connect to and import Power BI datasets in a workspace.

    1 vote  ·  0 comments
  11. Enable/disable toggle option on activities

    Can you please add a toggle option to enable/disable an activity in ADF pipelines, similar to the one available in SSIS .dtsx packages?

    1 vote  ·  0 comments
  12. Improve coding experience in pipeline activities

    When coding computed expressions in a data flow the experience is tolerable, but the same cannot be said for pipeline activities. Here you're limited to a very small window, and everything is functional, so simple expressions become extremely convoluted. It would be a huge improvement to allow the same Boolean syntax as in data flows, so you can write iif(x > 0 && y == 1) instead of iif(and(greater(x, 0), equals(y, 1))).
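
    To make the contrast concrete: in a data flow expression the check is simply x > 0 && y == 1, while in a pipeline activity (e.g. the expression of an If Condition) the same check becomes the nested functional form sketched below. The parameter names x and y are illustrative only.

    "expression": {
        "value": "@and(greater(pipeline().parameters.x, 0), equals(pipeline().parameters.y, 1))",
        "type": "Expression"
    }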

    1 vote  ·  0 comments
  13. Power BI

    Ability to refresh Power BI datasets in a workspace.

    1 vote  ·  0 comments
  14. How customers can perform security tests and validate that customer-managed keys and Microsoft-managed keys are really working

    Presently there is no option available to perform security tests and gather evidence of CMK usage for Azure Data Factory.

    1 vote  ·  0 comments
  15. Custom data generator as a data source

    Why is a custom data generator not yet implemented as a data source?

    1 vote  ·  0 comments
  16. Allow reusable components

    It should be possible to create reusable components. There are two cases for this:

    1) We use specialized ID generation based on row fields. The actual fields used, and the number of fields used, depend on the data, so this must be customizable.

    2) We have some fields that we would like to add to all our tables. It would be nice to be able to define some global derived columns that we could add; these should be parameterizable so that they can be combined with 1).

    1 vote  ·  0 comments
  17. What does this mean? Partner account is currently not supported.

    What does it mean when you say that a Concur "Partner account is currently not supported"?

    Do you or do you not support connections to SAP Concur?

    1 vote  ·  0 comments
  18. Copying of activities or pipelines

    Can we have a feature for copying existing activities and pipelines and pasting them into a new pipeline?

    1 vote  ·  1 comment
  19. Ability to have an alert set up for a pipeline that is running too long or hung up for some reason

    We have been having issues where a pipeline that we currently have set up keeps running. The job in question normally takes between 14 and 16 minutes to complete. However, every once in a while it just keeps running and never seems to finish. So far the longest it has gone without anyone on the IT team noticing has been ~20 hours, and it was not until an end user of the data noticed it was not updated that we were able to go in and fix the issue. Because it was still running there was no failure,…
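
    One option that may already cover part of this (a hedged sketch; the exact JSON key and the alert wiring should be verified against current documentation): a pipeline can declare an elapsed-time metric threshold, and an Azure Monitor alert can then be configured on the resulting elapsed-time pipeline-run metric. The 30-minute value below is just an example.

    "policy": {
        "elapsedTimeMetric": {
            "duration": "0.00:30:00"
        }
    }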

    1 vote  ·  0 comments
  20. Fix issue with copy data activity working in Debug but not with triggers

    I have a parametrized pipeline that uses the Copy Data activity, with the source being OData and the sink an on-premises SQL Server. They're executed with a self-hosted integration runtime.

    The pipeline was working successfully until last week (May 8th, 2020) or so, copying data from 32 tables. Now, I'm having issues with 2 tables. They work when I execute the pipeline in debug mode, but when triggering the execution with the same parameters, it fails. There are no changes to be published; I'm aware that the trigger executes the published pipeline version while debug executes it with…

    1 vote  ·  0 comments