Data Factory

Azure Data Factory allows you to manage the production of trusted information by offering an easy way to create, orchestrate, and monitor data pipelines over the Hadoop ecosystem using structured, semi-structured, and unstructured data sources. You can connect to your on-premises SQL Server, Azure databases, tables, or blobs and create data pipelines that process the data with Hive and Pig scripting, or with custom C# processing. The service offers a holistic monitoring and management experience over these pipelines, including a view of their data production and data lineage down to the source systems. The outcome of Data Factory is the transformation of raw data assets into trusted information that can be shared broadly with BI and analytics tools.

Do you have an idea, suggestion or feedback based on your experience with Azure Data Factory? We’d love to hear your thoughts.

  1. Make a wizard, similar to the new Copy (Preview) wizard, that just allows you to type any long-running SQL statement

    Need a way to enter an ad hoc SQL statement which you know will run for hours or days and have it managed by ADF.

    1 vote  ·  0 comments
  2. Azure Table Storage key field translations, JSON element naming, and rollover text

    I think it was a mistake to have a separate system for setting PartitionKey and RowKey values in the sink section, when you had a system set up in the translator area specifically for the purpose of translating fields (columns) when reading and writing entities. When I first tried to get a PartitionKey to come from another field in my source table (NewPKey in this example), I incorrectly tried to do it like this ...

    "translator": {
    "type": "TabularTranslator",
    "columnMappings": "NewPKey: PartitionKey, RowKey: RowKey, Value: Value"
    }
    ... of course, that didn't work. For logical consistency and simplicity, I wish that…
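    For the record, the approach that did work for me sets the keys on the sink rather than in the translator. This is a sketch based on my reading of the Azure Table sink type properties (property names as I recall them from the docs; verify against the current schema):

    ```json
    "sink": {
        "type": "AzureTableSink",
        "azureTablePartitionKeyName": "NewPKey",
        "azureTableRowKeyName": "RowKey"
    }
    ```

    It is exactly this second, sink-level system that feels redundant next to the translator's columnMappings.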

    1 vote  ·  0 comments
  3. Gantt chart mixes up seconds and minutes

    There is an error in the activity-run Gantt chart. It looks like it is confusing seconds with minutes.

    For example:
    Start Date 12-9-2019 14:18:28
    End Date 12-9-2019 14:18:57

    The Gantt chart shows this run spanning from minute 28 until minute 57, while these values are actually seconds.

    1 vote  ·  0 comments
  4. Author and Monitor portal - datetime format

    Data Factory v2 Author and Monitor portal: we need an option to control the locale/region settings so that datetimes are displayed in the corresponding format (Trigger Time, Run Started, Window Start Time, etc.).
    At least keep ISO format as the default. Definitely not the US datetime format, which is time-consuming to read and prone to error.

    1 vote  ·  0 comments
  5. Improve Mobile device support

    Fix the ADFv2 site so that it works on a mobile device. It is not possible to easily restart an activity from a small phone screen and it is impossible to navigate the pipeline map.

    1 vote  ·  0 comments
  6. Data Factory should support parsing Fixed Length Files

    Azure Data Factory should support parsing fixed-length files. Lots of ERP data exports are fixed-length files, and it is not clear why this type of connector or parser was not included out of the box. This is a real miss from the ADF product team; hopefully they will include it soon. Yes, there are workarounds and custom coding, but why not have it built in?

    1 vote  ·  0 comments
  7. Custom data generator as a datasource

    Why is the custom data generator not yet implemented as a data source?

    1 vote  ·  0 comments
  8. What does this mean? Partner account is currently not supported.

    What does it mean when you say that a Concur "Partner account is currently not supported"?

    Do you or do you not support connections to SAP Concur?

    1 vote  ·  0 comments
  9. Copying of activities or pipelines

    Can we have a feature to copy existing activities and pipelines and paste them into a new pipeline?

    1 vote  ·  1 comment
  10. Allow WebHook-activity to use JSON-array in the request-body

    Currently the WebHook activity requires a JSON body, but its validation fails on an array of JSON objects.

    Example usage:
    When you want to trigger Event Grid you must send a JSON array:
    https://docs.microsoft.com/en-us/azure/event-grid/post-to-custom-topic. This is not possible today. The alternative is to use the more complex Web activity, which does allow this input.

    Suggestion: allow JSON arrays as valid input.

    Note: this is for the 'simple' WebHook activity, not the Web activity.
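    Concretely, the linked docs expect a custom-topic body shaped like the array below (the field values here are illustrative only, in the style of the docs' example):

    ```json
    [
        {
            "id": "1807",
            "eventType": "recordInserted",
            "subject": "myapp/vehicles/motorcycles",
            "eventTime": "2019-12-09T14:18:28Z",
            "data": { "make": "Ducati", "model": "Monster" },
            "dataVersion": "1.0"
        }
    ]
    ```

    The WebHook activity rejects this body today, while the Web activity accepts it.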

    1 vote  ·  0 comments
  11. Use Lookup activity to populate pipeline parameter

    I have a lot of situations where I am loading many IDs from a REST API endpoint into a SQL table. Due to the nature of many APIs, to get more detailed information I then need to call an API endpoint for each ID. So I would like to use a Lookup activity to get my list of IDs, put that in a parameter, and then have a ForEach loop pull from the parameter to do each API call. I was told this isn't possible in the UI yet - https://social.msdn.microsoft.com/Forums/azure/en-US/1fcdf9dc-1aa4-430c-b693-4aadf435643d/populating-array-parameter-from-database-for-foreach-loop-activities?forum=AzureDataFactory
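    In the meantime, a workaround sketch that skips the parameter entirely (the activity name 'LookupIds' is made up here) is to point the ForEach items expression directly at the Lookup output in the pipeline JSON:

    ```json
    {
        "name": "ForEachId",
        "type": "ForEach",
        "typeProperties": {
            "items": {
                "value": "@activity('LookupIds').output.value",
                "type": "Expression"
            },
            "activities": []
        }
    }
    ```

    Having this wired up through a proper pipeline parameter, as requested, would still be cleaner.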

    1 vote  ·  0 comments
  12. Add a target mapping, created automatically for Dynamics

    Almost every Dynamics entity contains hundreds of columns, so it is very difficult to create the target SQL structure first and build the ADF job next. Integrate automatic creation of the target structure on the fly into ADF job creation. Almost all other Dynamics connector products on the market already support this feature, including detection and integration of new column changes.

    1 vote  ·  0 comments
  13. Add AWS Redshift connector as Destination in ADF

    We have a requirement to load data from an Azure SQL database to AWS Redshift, but there is no destination connector for Redshift in ADF. It is painful to work with alternative solutions when ADF would be the simplest option. I would request that Microsoft plan or add a Redshift destination connector in ADF.

    1 vote  ·  0 comments
  14. When will ADF V2 be supported in IE?

    When will ADF V2 be supported in IE?

    1 vote  ·  0 comments
  16. Provide a monitoring view with concurrent threads and I/Os per query

    Provide us a monitoring view showing concurrent threads and I/Os per query.

    1 vote  ·  0 comments
  17. Recursive folder copy should preserve the folder structure

    https://docs.microsoft.com/en-us/azure/data-factory/connector-file-system

    It is not clear how this is handled when the folder structure looks like the one below:

    MainFolder1
        Folder1
            SubFolder1
                File1
                File2
        Folder2
            SubFolder1
                File1
                File2
                File3
        Folder3
            SubFolder3
                File1
                File2
        Folder4
            SubFolder4
                File1
                File2
                File3

    The same structure has to come out at the target side.
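    For what it's worth, the copy activity does have settings aimed at this. A sketch based on my reading of the linked file-system connector page (verify the option names against the current docs): set `recursive` on the source and `copyBehavior` on the sink:

    ```json
    "source": {
        "type": "FileSystemSource",
        "recursive": true
    },
    "sink": {
        "type": "FileSystemSink",
        "copyBehavior": "PreserveHierarchy"
    }
    ```

    With `PreserveHierarchy`, the relative folder paths of the source files should be reproduced under the sink folder.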

    1 vote  ·  0 comments
  18. Explain very clearly, in a single sentence, what this feature actually does

    Thanks for the management talk crossed with technical buzzwords; it's probably great for stroking somebody's ego.

    Except nobody can really tell what this does from the description provided. My boss pointed to this and simply said "explain", and my best guess was:
    Manage trusted information easily by creating pipelines to my on-premises SQL Server using C#, where it will turn raw data into more data which can then be plugged into BI and analytics tools.

    So, why do I need this again?
    Why can't we plug our BI and analytics tools into the raw data themselves?

    MS has…

    1 vote  ·  0 comments
  19. The self-hosted integration runtime hogs memory. Can we limit its consumption?

    The self-hosted integration runtime hogs memory. Can we limit its consumption?

    1 vote  ·  1 comment
  20. Omit computed columns in the sink in the copy activity's column mapping

    In our staging table definitions we have included a calculated column that yields a row hash. We use this row hash to identify any potential rows that have changed in the source.

    However, the copy data activity tries to insert a NULL value to this column, generating an error and preventing us from using a row hash.

    Could you check for calculated columns in the sink and omit these columns when inserting data into the sink?
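    Until that exists, a workaround sketch is an explicit column mapping that simply leaves the computed row-hash column out (the column names below are invented for illustration):

    ```json
    "translator": {
        "type": "TabularTranslator",
        "columnMappings": "SrcId: StageId, SrcName: StageName"
    }
    ```

    Here the computed column (say, RowHash) is absent from the mapping, so the copy should never try to write to it; an automatic check in the sink would still be far more convenient.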

    1 vote  ·  0 comments
