source and sink.
89 votes
Related to this, we are now working on enabling Azure Search as a copy destination in ADF. If you are interested in this, please leave a comment here and we will contact you.
Provide the ability to publish Azure Data Factory ARM templates to a custom folder in the publish branch. An additional property could be added to the publish_config.json file to cater for this, e.g. a custom output-folder setting, as sketched below.
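For illustration, a minimal sketch of how such a setting could be expressed, assuming a hypothetical "publishFolder" property ("publishBranch" is the setting that exists in publish_config.json today):

```python
# Sketch only: "publishFolder" is a hypothetical property name, not an
# existing ADF setting; "publishBranch" already exists in publish_config.json.
import json

config = {
    "publishBranch": "adf_publish",        # existing: branch to publish to
    "publishFolder": "templates/factory",  # proposed: custom folder for ARM templates
}

with open("publish_config.json", "w") as f:
    json.dump(config, f, indent=2)
```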
Please provide support for using alternate keys to map lookups, as documented here: https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/use-alternate-key-create-record#using-alternate-keys-to-create-an-entityreference
It would be nice, for example, to use an alternate key defined on Contact when importing Accounts through ADF.
87 votes
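For context on the alternate-key request above, this is roughly how such a lookup is addressed when calling the Dataverse Web API directly; the ask is for the ADF sink to support the same mapping. A sketch, assuming emailaddress1 has been defined as an alternate key on Contact (org URL and token are placeholders):

```python
# Sketch: create an account whose primary contact is resolved by an
# alternate key rather than a GUID. All names are placeholders.
import requests

ORG_URL = "https://yourorg.api.crm.dynamics.com"  # placeholder
TOKEN = "<bearer token>"                          # placeholder

payload = {
    "name": "Contoso Ltd",
    # Alternate-key entity reference instead of /contacts(<guid>):
    "primarycontactid@odata.bind": "/contacts(emailaddress1='jane@contoso.com')",
}

resp = requests.post(
    f"{ORG_URL}/api/data/v9.2/accounts",
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}", "OData-Version": "4.0"},
)
resp.raise_for_status()
```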
There should be an option to clear old errors.
When there is no pipeline that produces or consumes a data slice, but the slice has errors, the counter still reports them as "current" errors, which is misleading. I would like to remove these unused slices and their errors.
82 votes
Can it consume or push data to an Azure App Service API, with support for Swagger APIs?
79 votes
Thank you for the feedback. We will look into this.
It would be great and very useful, in my opinion, if there were a Google Sheets connector.
Thanks in advance.
79 votes
Some Azure REST APIs and other third-party APIs use the PATCH method.
Please add support for this method, or make the method parameter a string so that we can use any method.
73 votes
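For illustration, the kind of call the Web Activity cannot make today; a plain Python sketch with a placeholder endpoint:

```python
# Sketch: a PATCH request (partial update), which the Web Activity's
# method list does not currently offer. URL and body are placeholders.
import requests

resp = requests.patch(
    "https://example.com/api/items/42",  # placeholder endpoint
    json={"status": "processed"},        # partial update payload
)
resp.raise_for_status()
```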
The copy performance of the ADF Copy Data activity going from a file system source to a Blob FileSystem or Blob sink is quite slow and CPU-intensive relative to other copy mechanisms when copying a large number (tens of thousands to millions) of small files (<1 MB).
Both AzCopy and Azure Storage Explorer complete the same copy operations, from the same source to the same sink, approximately 3-5x faster while using less CPU than the ADF Copy activity.
At a minimum, we would like to see performance parity with AzCopy / Azure Storage Explorer.
72 votes
REST API pagination needs to support RFC 5988 style links in the header.
Examples are ServiceNow and Greenhouse.
See: https://tools.ietf.org/html/rfc5988#page-6 for RFC
See: https://stackoverflow.com/questions/54589413/azure-data-factory-rest-api-to-service-now-pagination-issue for a related stack overflow question
Greenhouse link header example:
Link: <https://harvest.greenhouse.io/v1/applications?page=2&perpage=100>; rel="next", <https://harvest.greenhouse.io/v1/applications?page=129&perpage=100>; rel="last"
We need to grab the 'next' URL, which is not currently possible with the built-in pagination support.
The only way around this seems to be to fetch the data outside Data Factory (e.g. Databricks Python), which defeats the purpose.
72 votes
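A sketch of that outside-the-factory workaround, for reference: the requests library parses RFC 5988 Link headers into response.links, so following rel="next" takes a few lines (endpoint and credentials are placeholders):

```python
# Sketch: follow rel="next" links until the last page. Greenhouse-style
# endpoint and credentials are placeholders.
import requests

url = "https://harvest.greenhouse.io/v1/applications?per_page=100"
auth = ("harvest_api_key", "")  # placeholder credentials
rows = []

while url:
    resp = requests.get(url, auth=auth)
    resp.raise_for_status()
    rows.extend(resp.json())
    # requests exposes the parsed Link header; no rel="next" ends the loop.
    url = resp.links.get("next", {}).get("url")
```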
The existing SFTP connector in Azure Data Factory does not support multi-factor authentication. The connector supports either password-based or key-based authentication. Enterprises are moving toward multi-factor authentication requiring both a key and a password for SFTP. This is a must-have feature given the focus on information security.
72 votes
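For reference, key-plus-password SFTP is expressible in ordinary client libraries; a sketch in Paramiko, assuming a server that requires both factors (host details are placeholders):

```python
# Sketch: SFTP with both a private key and a password, the combination
# the connector lacks. Paramiko continues with the password when the
# server answers key auth with "partial success". Names are placeholders.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    "sftp.example.com",               # placeholder host
    username="loader",                # placeholder user
    key_filename="/keys/loader_rsa",  # first factor: private key
    password="s3cret",                # second factor: password
)
sftp = client.open_sftp()
sftp.get("/outbound/data.csv", "data.csv")
client.close()
```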
ADF must support elastic database transactions against Azure SQL Database.
This is equivalent to the on-premises scenario, where SSIS transactions use MSDTC against SQL Server.
Currently, if you set TransactionOption=Required on a data flow, and use an OLEDB connection to an Azure SQL Database, you receive an error like:
"The SSIS runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x80070057 "The parameter is incorrect".70 votes
When a Web Activity calls an API that returns a JSON array as the response, we get an error that says "Response Content is not a valid JObject". Please support JSON arrays at the top level of the response.
68 votes
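Until that is supported, one workaround is to front the API with a small proxy that wraps the array in an object; a sketch as an Azure Functions HTTP trigger (the upstream URL is a placeholder):

```python
# Sketch: wrap an upstream top-level JSON array in an object so the
# Web Activity receives a valid JObject. UPSTREAM_URL is a placeholder.
import json
import requests
import azure.functions as func

UPSTREAM_URL = "https://example.com/api/list"  # placeholder

def main(req: func.HttpRequest) -> func.HttpResponse:
    array = requests.get(UPSTREAM_URL).json()  # top-level JSON array
    return func.HttpResponse(
        json.dumps({"value": array}),          # now a JSON object
        mimetype="application/json",
    )
```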
An ADLS (Gen1) linked service is authenticated with a managed identity (MSI) or a service principal. When authenticating with MSI, we can't use Mapping Data Flows. Will this functionality be added?
67 votes
Add support for the Apache Kafka Producer, Consumer, Streams, and KSQL APIs in Azure Data Factory.
65 votes
Pipeline-level alerts are required in Azure Data Factory. A pipeline may have many activities, and since only activity-level alerts are available now, the mailbox fills up with alert emails. A single alert email should be sent once the pipeline has executed.
61 votes
As with SSIS, it would be good to add a right-click "add annotation" function. It is handy to leave notes about development that still needs to be done, warnings about executions, or just comments over areas of the pipeline.
60 votes
It would be great if Azure Data Factory pipelines could be executed by webhooks in addition to their default schedule.
The current limitation of re-running via PowerShell or the Azure portal is not graceful for a production environment or for automation.
Ideally, the pipeline would run on an HTTP POST to the webhook; this would resolve many automation challenges.
Potentially this could be integrated with Azure Logic Apps.
60 votes
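In the meantime, a pipeline run can already be started with a single HTTP POST to the management API, which is easy to put behind a webhook receiver; a sketch with placeholder resource names:

```python
# Sketch: trigger a pipeline run via the ADF REST API (the call a
# webhook handler could make). All names are placeholders; TOKEN is an
# AAD bearer token for https://management.azure.com.
import requests

SUB, RG, FACTORY, PIPELINE = "<sub-id>", "<rg>", "<factory>", "<pipeline>"
TOKEN = "<bearer token>"

url = (f"https://management.azure.com/subscriptions/{SUB}"
       f"/resourceGroups/{RG}/providers/Microsoft.DataFactory"
       f"/factories/{FACTORY}/pipelines/{PIPELINE}/createRun"
       "?api-version=2018-06-01")

resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print(resp.json()["runId"])
```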
The documentation says that white space in column names is not supported for Parquet files, but I would like to suggest implementing this feature. When many tables are copied at once, it is difficult to handle them case by case because Data Factory does not support white space in Parquet column names.
Check the attached file for details.
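As a stopgap, the files can be rewritten with sanitized column names before the copy; a sketch with PyArrow (file paths are placeholders):

```python
# Sketch: rewrite a Parquet file with whitespace stripped from its
# column names so ADF can copy it. Paths are placeholders.
import pyarrow.parquet as pq

table = pq.read_table("input.parquet")
table = table.rename_columns(
    [name.replace(" ", "_") for name in table.schema.names]
)
pq.write_table(table, "output.parquet")
```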
For the wall-clock trigger schedule, there should be a property that controls whether a new run of the pipeline is allowed while a previous run is still in progress.
59 votes
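For comparison, pipelines already have a "concurrency" property, but it queues extra runs rather than skipping them. The requested skip behavior can be approximated today by checking for in-progress runs before starting a new one; a sketch against the management API, with placeholder names:

```python
# Sketch: start a run only when none is in progress, approximating the
# requested trigger property. All names are placeholders; the bearer
# token is for https://management.azure.com.
import requests

BASE = ("https://management.azure.com/subscriptions/<sub-id>"
        "/resourceGroups/<rg>/providers/Microsoft.DataFactory"
        "/factories/<factory>")
HEADERS = {"Authorization": "Bearer <token>"}
API = "api-version=2018-06-01"

query = {
    "lastUpdatedAfter": "2001-01-01T00:00:00Z",
    "lastUpdatedBefore": "2100-01-01T00:00:00Z",
    "filters": [
        {"operand": "PipelineName", "operator": "Equals", "values": ["<pipeline>"]},
        {"operand": "Status", "operator": "Equals", "values": ["InProgress"]},
    ],
}
runs = requests.post(f"{BASE}/queryPipelineRuns?{API}",
                     headers=HEADERS, json=query).json()["value"]
if not runs:  # nothing running: safe to start a new run
    requests.post(f"{BASE}/pipelines/<pipeline>/createRun?{API}",
                  headers=HEADERS)
```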