Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year and will instead use first-party solutions for customer feedback.

Data Factory

Azure Data Factory allows you to manage the production of trusted information by offering an easy way to create, orchestrate, and monitor data pipelines over the Hadoop ecosystem using structured, semi-structured, and unstructured data sources. You can connect to your on-premises SQL Server, Azure databases, tables, or blobs and create data pipelines that process the data with Hive and Pig scripting, or custom C# processing. The service offers a holistic monitoring and management experience over these pipelines, including a view of their data production and data lineage down to the source systems. The outcome of Data Factory is the transformation of raw data assets into trusted information that can be shared broadly with BI and analytics tools.

Do you have an idea, suggestion or feedback based on your experience with Azure Data Factory? We’d love to hear your thoughts.

  1. add activity input as variable

    I would like to query the activity input as a variable, similar to the output. Please add the input JSON, as seen in the activity tab, to the activity variable @activity('aCopy'). This would ease logging when the input contains a lot of variables.
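    For illustration, a minimal sketch of how this might be consumed in a logging step, assuming the requested input property were exposed alongside the existing output property (the activity name 'aCopy' comes from the request; the Set Variable activity and the 'copyLog' variable are hypothetical):

        {
            "name": "LogCopyDetails",
            "type": "SetVariable",
            "dependsOn": [ { "activity": "aCopy", "dependencyConditions": [ "Succeeded" ] } ],
            "typeProperties": {
                "variableName": "copyLog",
                "value": {
                    "value": "@concat(string(activity('aCopy').output), ' | ', string(activity('aCopy').input))",
                    "type": "Expression"
                }
            }
        }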

    1 vote · 0 comments
  2. support parameterization in Snowflake linked service

    Currently, Snowflake is not listed in https://docs.microsoft.com/en-us/azure/data-factory/parameterize-linked-services#supported-linked-service-types

    We have to hard-code the database and warehouse names. We hope to get the ability to parameterize the Snowflake connector just like the others.
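    For context, a minimal sketch of what the requested parameterization could look like, following the pattern the linked documentation shows for supported linked service types. The linked service name, account, and connection-string layout below are illustrative assumptions, not a confirmed Snowflake schema:

        {
            "name": "LS_Snowflake",
            "properties": {
                "type": "Snowflake",
                "parameters": {
                    "dbName": { "type": "String" },
                    "whName": { "type": "String" }
                },
                "typeProperties": {
                    "connectionString": "jdbc:snowflake://myaccount.snowflakecomputing.com/?db=@{linkedService().dbName}&warehouse=@{linkedService().whName}"
                }
            }
        }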

    1 vote · 0 comments
  3. ADF blob storage trigger does not fire

    I have an event trigger - based on a blob event - that does not fire when a file is modified (created, updated/appended) programmatically (via ADF/Azure itself), but it works when I manually drop a file into the folder via Azure Storage Explorer.

    I have tried using both "versions" of the storage linked service in ADF (ADLS Gen2 vs. Blob).

    I have ensured there is no more than one event subscription on the storage account for that container.

    I have tried changing the trigger options (ignore empty blobs, begins with, ends with, etc.).
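    For reference, a minimal sketch of a storage event trigger showing the options mentioned above; the trigger, container, pipeline, and scope values are placeholders, not the reporter's actual configuration:

        {
            "name": "TR_OnNewBlob",
            "properties": {
                "type": "BlobEventsTrigger",
                "typeProperties": {
                    "blobPathBeginsWith": "/mycontainer/blobs/incoming/",
                    "blobPathEndsWith": ".csv",
                    "ignoreEmptyBlobs": true,
                    "events": [ "Microsoft.Storage.BlobCreated" ],
                    "scope": "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
                },
                "pipelines": [ { "pipelineReference": { "referenceName": "PL_Ingest", "type": "PipelineReference" } } ]
            }
        }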

    1 vote · 0 comments
  4. Please remove the space from the safe character list for parameter names, or fix the backend to support such naming

    It looks like spaces in dataset parameter names cause the following error:

    ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression 'body('ActionNameHereComposeRuntimeVariables')?.datasetname_here8ab51b992aa348d4af0eed67bc19a7ba.parameter with space here'

    From the error message we can clearly see that spaces in parameter names won't work with the generated expression. A minimal sketch of the difference follows the list below.

    Either the backend code should be fixed, or:
    1. The problems with spaces must be documented.
    2. If possible, spaces should be blocked in the UI, or at least a warning should be given when they are used in parameter names.
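    For illustration, a minimal sketch contrasting a parameter name containing a space with a safe one; the dataset and parameter names are hypothetical:

        {
            "name": "DS_Example",
            "properties": {
                "type": "DelimitedText",
                "parameters": {
                    "file name": { "type": "String" },
                    "fileName":  { "type": "String" }
                }
            }
        }

    A reference such as @dataset().fileName parses as a single property access, whereas @dataset().file name cannot, which matches the trailing "parameter with space here" fragment in the error above.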

    1 vote · 0 comments
  5. Allow processing of files that have an O365 sensitivity label

    I'm currently blocked from processing, in a Copy activity, an Excel workbook that has an O365 sensitivity label / AIP label applied to it. Please allow the specification of a service principal so that processing protected files becomes an option.

    I get the following error when trying to get data from a protected file:

    ErrorCode=EncryptedExcelIsNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Encrypted excel file 'Protected_file.xlsx' is not supported, please remove its password.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=ICSharpCode.SharpZipLib.Zip.ZipException,Message=Wrong Local header signature: 0xE011CFD0,Source=ICSharpCode.SharpZipLib,'

    1 vote · 0 comments
  6. Automatically pause the SSIS integration runtime

    The PaaS databases (SQL DB) go into pause mode when not used. It would be nice if the SSIS integration runtime could also automatically stop, and resume when there are new package runs.

    1 vote · 0 comments
  7. Need to stream video from Amazon S3 to Azure Media Services

    Hi team, we need your support with the Media Services on-demand streaming functionality. We have approximately 300 TB of data that is currently stored in another cloud storage service, and it is very hard to move all of that data to Azure Storage, so we are looking for a plugin or some other way to request the data from there on demand and then stream it using Azure Media Services.

    1 vote · 0 comments
  8. Switch data factories from the ADF portal

    We started using Databricks recently, and the thing I really enjoy is the ability to switch workspaces from the Databricks portal itself. Our team uses multiple data factories, and sometimes the work spans several of them. To switch between data factories, I have to either open two separate tabs from the Azure portal when I start an investigation, or go back to the Azure portal and open the other data factory from there. It would be very useful if we could switch data factories from the ADF portal itself.

    1 vote · 0 comments
  9. Allow a source column in one copy activity to be mapped to multiple target columns

    Currently, mappings in ADF don't allow one source column to be mapped to multiple target columns. This becomes tricky when we have to re-use a source column.

    Please consider this; a sketch of the requested mapping is shown below.
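    For illustration, a minimal sketch of the requested behavior in a Copy activity translator; the column names are hypothetical, and the second entry re-using the same source column is exactly what is rejected today:

        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "name": "Email" }, "sink": { "name": "ContactEmail" } },
                { "source": { "name": "Email" }, "sink": { "name": "BillingEmail" } }
            ]
        }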

    1 vote · 0 comments
  10. Activity to delete a record in a table

    It is currently not possible to delete records from a table using the pipeline.
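    As a point of reference, a common workaround today is to call a stored procedure that performs the delete from a Stored Procedure activity; a minimal sketch, assuming a hypothetical procedure dbo.usp_DeleteRecord and an existing Azure SQL linked service named LS_AzureSql:

        {
            "name": "DeleteRecord",
            "type": "SqlServerStoredProcedure",
            "linkedServiceName": { "referenceName": "LS_AzureSql", "type": "LinkedServiceReference" },
            "typeProperties": {
                "storedProcedureName": "dbo.usp_DeleteRecord",
                "storedProcedureParameters": {
                    "RecordId": { "value": "@pipeline().parameters.recordId", "type": "Int32" }
                }
            }
        }

    A first-class delete activity would avoid having to maintain such procedures for every table.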

    1 vote · 0 comments
  11. How can we have a convenient code review process related to Azure Data Factory?

    In a PR that includes pipeline or data flow JSON changes, it is extremely difficult for a reviewer to understand, in a visual way, what has been changed.

    1 vote · 0 comments
  12. Idea to support self-hosted IR as target and source in data flow

    In order to support any on-premises database accessed through the self-hosted integration runtime in data flows, the following design change could enable it:

    1. Currently, an on-premises database can be used as a source or target by using a Copy activity alongside the data flow.
    2. If the user chooses an on-premises database as either source or target, the tool should prompt for a staging layer (ADLS or Azure SQL Database) and automatically create the required Copy activity in the pipeline along with the data flow.

    All that I am suggesting is that the same Copy activity be generated based…

    1 vote · 0 comments
  13. Include ODBC connector in Data Flow

    Hello,

    It would be great if Azure Data Factory also supported an ODBC connector in Data Flow.

    Thank you very much!

    1 vote · 0 comments
  14. D365 Alternate Keys capability for lookup fields

    For the D365 sink, enable the population of lookup fields using alternate keys.

    1 vote · 0 comments
  15. Make ADF use all_tab_partitions / user_tab_partitions instead of dba_tab_partitions for reading data from Oracle

    While working with Microsoft on an issue, ADF was not able to read data from an Oracle source using all_tab_partitions / user_tab_partitions; instead, it was looking for dba_tab_partitions to read the information related to partitioned tables. ADF should strictly use all_tab_partitions to access partitioned-table information.

    1 vote · 0 comments
  16. Allow dynamic content for the waitOnCompletion property of the Execute Pipeline activity

    There are various reasons to execute pipelines with Wait on Completion set to either true or false, e.g. running multiple pipelines in parallel or sequentially, or waiting on slower environments and not waiting on faster ones.
    A user interface for this would be nice and consistent, but for me it would suffice if it could be set in the JSON like this: "waitOnCompletion": { "value": "@variables('varWaitOnCompletion-bln')", "type": "Expression" }.
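    For illustration, the requested usage in the context of an Execute Pipeline activity. waitOnCompletion currently accepts only a literal boolean, so the expression form below is the proposed behavior rather than something that works today; the activity and pipeline names are hypothetical, and the variable name comes from the request:

        {
            "name": "RunChildPipeline",
            "type": "ExecutePipeline",
            "typeProperties": {
                "pipeline": { "referenceName": "PL_Child", "type": "PipelineReference" },
                "waitOnCompletion": {
                    "value": "@variables('varWaitOnCompletion-bln')",
                    "type": "Expression"
                }
            }
        }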

    1 vote · 0 comments
  17. Azure Data Factory - Web Activity - maximize supported output response payload

    In the Microsoft document https://docs.microsoft.com/en-us/azure/data-factory/control-flow-web-activity, there is a note that reads, "The maximum supported output response payload size is 4 MB."
    But when we receive response data of more than 4 MB, and pagination is also time-consuming, what can we do to increase the response payload limit?

    1 vote · 0 comments
  18. Data Factory Copy Activity for DynamicsCRMSink Forcing Alternate Key Use

    As per the documentation, the use of an alternate key in the D365 CRM sink Copy activity is not mandatory. But when I use the activity, the validation message keeps popping up, asking me to define an alternate key.

    1 vote · 0 comments
  19. Azure Data Factory's FTP activity uses the SIZE FTP command prior to getting files

    It looks as though the Azure Data Factory copy activity starts with a "SIZE" FTP command before copying remote FTP files.
    This may be necessary to determine the remote file size and choose the best options to download it; still, on some FTP servers (some banks, for example), remote users are not allowed to issue the "SIZE" command, and the ADF copy activity fails (even though it does list files when testing the connection).
    Could the initial "SIZE" command be made optional for the ADF activity?

    1 vote · 1 comment
  20. Filter pipeline runs by a parameter set to a specific value

    We use the same pipeline for many different states and licenses. Many times we have an issue with a particular state. If we could filter pipeline runs by our parameter "Region" set to a specific value like "MI", that would be very beneficial.

    1 vote · 0 comments