Data Factory

Azure Data Factory allows you to manage the production of trusted information by offering an easy way to create, orchestrate, and monitor data pipelines over the Hadoop ecosystem using structured, semi-structured, and unstructured data sources. You can connect to your on-premises SQL Server, Azure databases, tables, or blobs and create data pipelines that process the data with Hive and Pig scripting, or with custom C# processing. The service offers a holistic monitoring and management experience over these pipelines, including a view of their data production and data lineage down to the source systems. The outcome of Data Factory is the transformation of raw data assets into trusted information that can be shared broadly with BI and analytics tools.

Do you have an idea, suggestion or feedback based on your experience with Azure Data Factory? We’d love to hear your thoughts.

  1. Show detailed code samples.

    We have documentation and samples on how to create and run pipelines using C#; however, we don't have information on how to add a translator / mappings.

    1 vote · 0 comments
  2. Freeze parts of data factory, so debug runs of one/more partial pipelines (start anywhere).

    If I want to debug the middle activities of a long pipeline, why start from the beginning? Offer a way to provide the begin-state for the activity where I want to start debugging.

    1 vote · 0 comments
  3. The Xero connector doesn't appear to return data for the Employee schema

    The Xero connector doesn't appear to return data for the Employee schema. It does allow me to select the Employee schema ("Minimal"."Employee" or "Employee"), and it does return the column names, but no data is returned.

    1 vote · 0 comments
  4. batch_count

    The batch_count argument is not available on the ForEach class in the Python SDK.

    1 vote · 0 comments
  5. choose custom column separator when using 'enable staging' in copy activity

    We have a number of data sources that contain quotes, etc., and when using the copy activity with 'enable staging' selected and a storage account and path specified, the copy fails with the error below:

    'found more columns than expected'

    It would be good if we could specify a custom separator to avoid this problem.

    1 vote · 0 comments
  6. The mapper is built backwards

    If I'm trying to build a table for a stored procedure, it lays out every field on my input side, but the format that I MUST build is on the output side. Start with the output side and let me fill in the input side. I could spend multiple days on this one project alone. EVERY other tool that I've ever used is easier than this.

    1 vote · 0 comments
  7. Expand Pipeline Activity For Contained Items

    I would like the ability to expand a pipeline activity for the activities within that pipeline. This would allow for easily copying an existing pipeline into a new one without having to copy activities individually. It would improve the speed for creating loops around existing pipelines.

    1 vote · 0 comments
  8. Allow use of all connectors as destination (not just source)

    Allow use of all connectors as destinations (not just sources); the source connector list is much larger than the destination list. This would enable many additional use cases for ADF (even alongside other integration platforms) as another toolset to expand functionality.

    1 vote · 0 comments
  9. "Azure Data Factory"

    Please allow the creation of a pipeline with the Hive activity using dummy inputs and outputs.

    1 vote · 1 comment
  10. Generate Alert at Activity level

    I am not sure whether this feature is already available, but I am looking for monitoring at the activity level. Also, differentiate alerts for different conditions, i.e., data-related errors vs. connection-related errors (based on the error, trigger an alert to a specific team). Is this currently possible? If yes, can we have an example?

    1 vote · 0 comments
  11. Linking ADO Git Repo to ADF - Artifacts not displayed

    When linking an ADO Git Repo to ADF, default values are correctly populated, but when confirming the link, no artifacts are shown. When unlinking the Git Repo, and then linking the same Git Repo using the "Select a different directory" checkbox, the artifacts are correctly shown.

    It would be nice if they were correctly shown without needing to unlink and then re-link the repo.

    1 vote · 0 comments
  12. Lookup

    My idea is to have the Lookup component working properly, at least with basic functionality.
    Using Azure Data Lake / Blob storage as the source, the component crashes with the error message below:
    "Activity lkLoadAllFilesNames failed: "

    How is it possible to find out what is happening and what is not working?

    Regards

    1 vote · 1 comment
  13. need indexOf(pattern) support and replace(fromIndex, toIndex) support

    Sometimes we need to find and replace/remove pieces of a string starting from a certain index. These are core functions in most languages. It would be awesome to get support for them in the expression builder.

    1 vote · 0 comments
  14. Snowflake connector support for both Azure and AWS instances

    Provide support to use the Snowflake connector to access both AWS and Azure instances of Snowflake.

    1 vote · 0 comments
  15. Add connector for OneDrive

    Add connector for OneDrive

    1 vote · 0 comments
  16. sFTP size limitation

    The documentation does not mention any size limit for transferring files via sFTP, yet I am encountering an issue with a 500 MB file.

    1 vote · 0 comments
  17. ADF copy data activity with dynamic source dataset

    ADF currently supports dynamic content in various fields of the copy data activity, including dynamic mapping.
    Looking for support for the Source Dataset field as dynamic content as well, so that the source can also be decided at runtime/execution.

    1 vote · 0 comments
  18. ADF v2: allow Table Storage queries to be passed with quotes

    In ADF v2, when a storage table is used as a dataset and you choose to write a query instead of reading the entire table, quotes in the query to the storage table are not accepted. For example, neither PartitionKey gt 'abcdfdsg' nor PartitionKey gt "abcdfdsg" seems to be accepted.

    1 vote · 1 comment
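Idea 1 above asks for samples showing how to attach a translator / mappings when building pipelines in code. As a hedged sketch, the mapping section of a copy activity looks roughly like the following, expressed here as a Python dict; the property names follow the TabularTranslator JSON shape documented for copy activities and should be verified against the current service schema, and the column names are hypothetical.

```python
import json

# Sketch: the "translator" section a copy activity's typeProperties can
# carry to map source columns to sink columns. Property names follow the
# documented TabularTranslator JSON shape; verify before relying on them.
copy_activity_type_properties = {
    "source": {"type": "SqlSource"},  # illustrative source type
    "sink": {"type": "BlobSink"},     # illustrative sink type
    "translator": {
        "type": "TabularTranslator",
        "mappings": [
            # hypothetical column names, for illustration only
            {"source": {"name": "CustomerId"}, "sink": {"name": "Id"}},
            {"source": {"name": "FullName"}, "sink": {"name": "Name"}},
        ],
    },
}

print(json.dumps(copy_activity_type_properties, indent=2))
```

The C# SDK exposes the same shape through model classes, so a sample there would populate an equivalent translator object on the copy activity.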
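Idea 5 above concerns the 'enable staging' option of the copy activity. A hedged sketch of the relevant configuration (property names follow the documented enableStaging / stagingSettings shape; the linked service name and path are hypothetical) shows that stagingSettings has no field for a column separator, which is exactly the gap that idea asks to fill.

```python
# Sketch: copy activity typeProperties with interim staging enabled.
# Note there is no property here for a custom column separator; that is
# what the idea requests. Names and paths below are hypothetical.
copy_with_staging = {
    "enableStaging": True,
    "stagingSettings": {
        "linkedServiceName": {
            "referenceName": "MyStagingStorage",  # hypothetical linked service
            "type": "LinkedServiceReference",
        },
        "path": "stagingcontainer/stagingpath",  # hypothetical blob path
        "enableCompression": True,
    },
}

# Confirm no separator-like setting exists in this sketch:
print("separator" in copy_with_staging["stagingSettings"])  # False
```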
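To make the semantics requested in idea 13 concrete, here is a plain-Python sketch of the two missing functions: finding a substring starting from a given index, and replacing a span identified by an index range. The names mirror the request and are not existing ADF expression functions.

```python
def index_of_from(s: str, pattern: str, from_index: int = 0) -> int:
    """Return the index of pattern at or after from_index, or -1 if absent."""
    return s.find(pattern, from_index)

def replace_range(s: str, from_index: int, to_index: int, replacement: str) -> str:
    """Replace the characters in [from_index, to_index) with replacement."""
    return s[:from_index] + replacement + s[to_index:]

print(index_of_from("a-b-c", "-", 2))                # 3
print(replace_range("filename.csv", 8, 12, ".txt"))  # filename.txt
```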
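For the Table Storage query idea above: the OData filter syntax Table Storage accepts uses single-quoted string literals, with an embedded single quote escaped by doubling it. A small hypothetical helper illustrates building such a filter string.

```python
# Hypothetical helper showing the OData filter literal syntax Azure Table
# Storage expects: single-quoted strings, with an embedded single quote
# escaped by doubling it.
def partition_key_gt(value: str) -> str:
    escaped = value.replace("'", "''")  # OData escape for a literal '
    return f"PartitionKey gt '{escaped}'"

print(partition_key_gt("abcdfdsg"))  # PartitionKey gt 'abcdfdsg'
print(partition_key_gt("O'Brien"))   # PartitionKey gt 'O''Brien'
```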