Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will leverage 1st party solutions for customer feedback. Learn more here.

Data Factory

Azure Data Factory allows you to manage the production of trusted information by offering an easy way to create, orchestrate, and monitor data pipelines over the Hadoop ecosystem using structured, semi-structured and unstructured data sources. You can connect to your on-premises SQL Server, Azure database, tables or blobs and create data pipelines that will process the data with Hive and Pig scripting, or custom C# processing. The service offers a holistic monitoring and management experience over these pipelines, including a view of their data production and data lineage down to the source systems. The outcome of Data Factory is the transformation of raw data assets into trusted information that can be shared broadly with BI and analytics tools.

Do you have an idea, suggestion or feedback based on your experience with Azure Data Factory? We’d love to hear your thoughts.

  1. Allow resuming a pipeline from the point of failure

    I created a master pipeline that executes other child pipelines. If there is an error in one of the child pipelines and I fix the issue and rerun the failed child pipeline (resume is not available), the parent pipeline doesn't resume. I have to rerun the parent from the very beginning, which forces me to reload all the data from source to stage and then from stage to EDW.
    This is really ridiculous. At least show the activities in the monitor that were scheduled but didn't run due to a child pipeline failure and allow us to manually…
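
    For reference, a minimal sketch of the master/child pattern in question, with hypothetical pipeline names (ChildA is a placeholder); the ExecutePipeline activity with waitOnCompletion is what ties the parent run to the child's outcome:

        {
            "name": "MasterPipeline",
            "properties": {
                "activities": [
                    {
                        "name": "RunChildA",
                        "type": "ExecutePipeline",
                        "typeProperties": {
                            "pipeline": { "referenceName": "ChildA", "type": "PipelineReference" },
                            "waitOnCompletion": true
                        }
                    }
                ]
            }
        }

    When RunChildA fails, rerunning ChildA on its own does not update the parent run, which is the gap described above.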

    60 votes · 1 comment
  2. Support dynamic content in the pagination rule of the REST source in the Copy Data activity

    I'm using a pagination rule in a Copy Data activity from a REST endpoint to Blob storage.

    I have applied the pagination rule with a dynamic value like this:

    AbsoluteUrl = @replace('$.nextLink','oldbaseurl','newbaseurl')

    But it is still accessing the oldbaseurl that I'm getting in the response; the function doesn't replace it with the new one.

    Error in data factory:

    ErrorCode=UserErrorFailToReadFromRestResource,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=An error occurred while sending the request.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.Http.HttpRequestException,Message=An error occurred while sending the request.,Source=mscorlib,''Type=System.Net.WebException,Message=Unable to connect to the remote server,Source=System,''Type=System.Net.Sockets.SocketException,Message=A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has…
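
    For context, a sketch of the copy activity source being described (the sink type is a placeholder); whether pagination-rule values are evaluated as dynamic content at all is exactly what this idea asks for:

        {
            "name": "CopyFromRest",
            "type": "Copy",
            "typeProperties": {
                "source": {
                    "type": "RestSource",
                    "paginationRules": {
                        "AbsoluteUrl": "@replace('$.nextLink','oldbaseurl','newbaseurl')"
                    }
                },
                "sink": { "type": "DelimitedTextSink" }
            }
        }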

    60 votes · 1 comment
  3. Configure for singleton Pipeline Run

    For the wall-clock trigger schedule, there should be some property by which we can control whether to allow a new run of a pipeline if a previous run is already in progress.
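
    A partial workaround that exists today, sketched below: the pipeline-level concurrency property caps simultaneous runs at 1, but extra runs are queued rather than skipped, so it is not quite the singleton behavior requested:

        {
            "name": "MyPipeline",
            "properties": {
                "concurrency": 1,
                "activities": []
            }
        }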

    59 votes · 3 comments
  4. Support response headers and cookies as output from Http Connector

    I'm trying to use the HTTP Linked Service to authenticate to a SOAP service and then subsequently retrieve data from the same service. I do not have access to change the SOAP service, as it is a third-party service. After successfully authenticating, the SOAP service sets a session cookie with an authorization token. How can I access the cookie (as an output) in the response header from the WebActivity that I used to perform the authentication? I would then like to use the output in CopyDataActivity to set an additional header with that authorization token output from the previous WebActivity. Does…
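
    To illustrate the request, a hypothetical sketch: the expression below references a Set-Cookie response header from the authentication Web activity, which is not exposed in the activity output today, so this does not currently work; it only shows the shape of what is being asked for:

        {
            "name": "CopyWithSession",
            "type": "Copy",
            "typeProperties": {
                "source": {
                    "type": "RestSource",
                    "additionalHeaders": "Cookie: @{activity('Authenticate').output.ADFWebActivityResponseHeaders['Set-Cookie']}"
                },
                "sink": { "type": "JsonSink" }
            }
        }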

    59 votes · 5 comments
  5. Remove output limitations on Web and Azure Function activities

    Currently, if you make a call to a web API and the JObject returned is greater than 1 MB in size, the activity fails with the error:

    "The length of execution ouput is over limit (around 1M currently). "

    This is a big limitation, and it would be great if the limit were removed or increased.

    59 votes · 4 comments
  6. Implement a .NET SDK to publish pipeline code to ADF enabled with DevOps GIT

    Request to add an API which can publish code directly to a DevOps Git branch in ADF. Currently the client.Pipelines.CreateOrUpdate() API publishes the pipeline code to the ADF repo, but since we are working on automation projects, it would be great if a new API were introduced that can publish the code directly to its respective Git branch in ADF, which is currently missing.

    Regards,
    Nihar

    59 votes · 0 comments
  7. Add retry policy to webhook activity

    Right now it is not possible to retry a Webhook activity. Sometimes these activities fail due to a 'bad request' or other issues that can easily be resolved by re-running them; however, so far that retry is a manual process.
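
    For comparison, a sketch of the retry policy block that ordinary execution activities such as Web accept today (the URL is a placeholder); the idea is for the Webhook activity to honor the same block:

        {
            "name": "CallEndpoint",
            "type": "WebActivity",
            "policy": {
                "retry": 3,
                "retryIntervalInSeconds": 30
            },
            "typeProperties": {
                "url": "https://example.com/hook",
                "method": "POST",
                "body": {}
            }
        }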

    58 votes · 1 comment
  8. SharePoint List Destination

    Would like a target destination of a SharePoint list.

    57 votes · 1 comment
  9. Web Activity and Rest Connector OAuth support

    The usefulness of the Web Activity and the REST Connector is hamstrung without OAuth support for authentication. Many third-party services require it for consumption.
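
    A common interim pattern, sketched with placeholder endpoints and assuming the token endpoint returns a JSON body with an access_token field: one Web activity fetches the token, and a second passes it as a Bearer header:

        [
            {
                "name": "GetToken",
                "type": "WebActivity",
                "typeProperties": {
                    "url": "https://login.example.com/oauth2/token",
                    "method": "POST",
                    "headers": { "Content-Type": "application/x-www-form-urlencoded" },
                    "body": "grant_type=client_credentials&client_id=<id>&client_secret=<secret>"
                }
            },
            {
                "name": "CallApi",
                "type": "WebActivity",
                "dependsOn": [ { "activity": "GetToken", "dependencyConditions": [ "Succeeded" ] } ],
                "typeProperties": {
                    "url": "https://api.example.com/v1/data",
                    "method": "GET",
                    "headers": {
                        "Authorization": "@{concat('Bearer ', activity('GetToken').output.access_token)}"
                    }
                }
            }
        ]

    In practice the client secret would come from Key Vault rather than being hardcoded, and this pattern does not cover refresh tokens or other OAuth grant types, which is why native connector support is being requested.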

    57 votes · 0 comments
  10. Allow Debugging of Individual Activities in ADFv2

    In the ADFv2 GUI, clicking Debug runs the entire pipeline. I would like the option to run only one activity, or multiple selected activities, in the pipeline via debug. In the meantime, I have been creating a test pipeline and copying activities there for debugging, but this workaround wastes time.

    56 votes · 2 comments
  11. Function App activity widget MSI Authentication support

    In the new Function App activity widget, we need support for MSI authentication between ADF and the function it is calling when anonymous access to the function is disabled, the same way it's available in the Web activity widget that we currently use to call the Function App.
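
    For reference, the Web activity MSI option being described, sketched with placeholder names; for an AAD-protected Function App, the resource would typically be the app registration's ID URI:

        {
            "name": "CallFunctionViaWeb",
            "type": "WebActivity",
            "typeProperties": {
                "url": "https://myfuncapp.azurewebsites.net/api/MyFunction",
                "method": "POST",
                "body": {},
                "authentication": {
                    "type": "MSI",
                    "resource": "https://myfuncapp.azurewebsites.net"
                }
            }
        }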

    56 votes · 1 comment
  12. Project Online support in ADFv2

    Customer needs to connect to Project Online in ADF v2; currently it is not possible to work this out using the ADF v2 OData connector.

    55 votes · 6 comments
  13. Better error logging and debug options for Dataflow

    There is no precise error logging for Data Flow in some scenarios. I am using a simple single source file that is split into multiple files based on the number of rows in the file. Sometimes I get an error:

    'Error: Dataflow execution failed due to user configuration error'

    The same code seems to run fine in other environments, so it is becoming almost impossible to backtrack the error and fix it. Any help would be appreciated.

    55 votes · 4 comments
  14. Service Bus Activity

    Add an activity to post a Service Bus message. This would greatly expand interoperability.
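
    Until a native activity exists, one workaround is a Web activity against the Service Bus REST API, sketched below with placeholder names; generating the SAS token is omitted here and would typically come from a preceding activity or Key Vault:

        {
            "name": "PostServiceBusMessage",
            "type": "WebActivity",
            "typeProperties": {
                "url": "https://mynamespace.servicebus.windows.net/myqueue/messages",
                "method": "POST",
                "headers": {
                    "Authorization": "<SAS token>",
                    "Content-Type": "application/json"
                },
                "body": { "event": "pipeline-finished" }
            }
        }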

    54 votes · 0 comments
  15. Talk to the O365 OneRM team and get a copy of their O365DataTransfer program -- it does a lot of things DF needs to do.

    O365 OneRM team is solving the same problems that you are, and has a mature platform that does many of the things that Azure DF will need to do. Talk to Zach, Naveen, and Karthik in building 2. Also talk to Pramod. It'll accelerate you in terms of battle-hardened user needs and things pipeline automation has needed to do in the field. You'll want to get a copy of the bits/code.

    53 votes · 0 comments
  16. Add Update function to Salesforce using Data factory

    The new v2 Azure Data Factory supports only Insert/Upsert transactions to Salesforce. Can we expect support for Update transactions as well?

    Thanks!
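
    For context, a sketch of the Salesforce sink options available today (the field name is a placeholder); writeBehavior currently accepts Insert or Upsert, and this idea asks for Update as a third option:

        "sink": {
            "type": "SalesforceSink",
            "writeBehavior": "Upsert",
            "externalIdFieldName": "External_Id__c",
            "ignoreNullValues": false
        }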

    53 votes · 4 comments
  17. Dark mode

    Working with ADF is a good experience, but it could be more user-friendly if ADF supported dark mode, because the all-white interface is kind of distracting.

    53 votes · 3 comments
  18. Output files to FTP Server

    When output is ready, it should be possible to save the files to an FTP folder, i.e., FTP becomes an output as well.

    52 votes · 5 comments
  19. Allow for scheduling & running Azure Batch with Docker Containers through Azure Data Factory

    Currently it isn't possible to schedule or trigger Azure Batch with Docker containers in Azure Data Factory (it's only possible if you use VMs on Azure Batch).
    Azure Data Factory would be a stronger product if it supported this, as currently one needs to set up other scheduling tools (e.g., Apache Airflow) to trigger Azure Batch running Docker containers.

    Forum link: https://github.com/MicrosoftDocs/azure-docs/issues/16473
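
    For context, the Azure Batch integration that exists today is the Custom activity, sketched below with placeholder names; it runs a command on a Batch pool VM and has no container image option, which is the gap described:

        {
            "name": "RunOnBatch",
            "type": "Custom",
            "linkedServiceName": { "referenceName": "AzureBatchLS", "type": "LinkedServiceReference" },
            "typeProperties": {
                "command": "python process.py",
                "resourceLinkedService": { "referenceName": "BlobStorageLS", "type": "LinkedServiceReference" },
                "folderPath": "customactivity/scripts"
            }
        }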

    52 votes · 0 comments
  20. Support pulling storage account key from Azure Key Vault (not from a secret)

    When you set up Key Vault to periodically rotate the storage account key, it stores the key not in a secret but under a URI similar to https://<keyvault>.vault.azure.net/storage/<storageaccountname>

    The setup instructions for this automatic key rotation are here:
    https://docs.microsoft.com/en-us/azure/key-vault/key-vault-ovw-storage-keys#manage-storage-account-keys

    Please enhance Azure Data Factory so that you can pull the storage account key for use in a linked service from this location in Azure Key Vault. Currently ADF only supports pulling from secrets, not from storage keys in Key Vault.
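
    For contrast, a sketch of what ADF supports today, with placeholder names: a linked service can reference a Key Vault secret, but there is no equivalent reference for keys stored under the /storage/ path:

        {
            "name": "BlobStorageLS",
            "properties": {
                "type": "AzureBlobStorage",
                "typeProperties": {
                    "connectionString": {
                        "type": "AzureKeyVaultSecret",
                        "store": { "referenceName": "MyKeyVaultLS", "type": "LinkedServiceReference" },
                        "secretName": "storage-connection-string"
                    }
                }
            }
        }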

    52 votes · 0 comments