Add datasets to Data Flows that can support all the connectors found in Azure Data Factory
I have a use case where the source is SAP and the sink is ODBC-based, and I don't want to land the data in any intermediate staging storage. Just let it flow through. To date, Data Flows support only 5 types of datasets when defining a source or a sink (Azure Blob Storage, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure SQL Data Warehouse, and Azure SQL Database).
Keith Mescha commented
Mark, if the recommendation is to use Copy Activity, what is the best practice for handling failed rows when doing things like upserts to Microsoft Dynamics? I would like to update the source table with which rows failed, along with the error message, so the upstream system can address them. It seems the only way is to stage to Blob and then have another Copy Activity pull from Blob to update the source. Or am I hopefully missing something?
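One option worth checking before building a second staging pipeline: Copy Activity has built-in fault tolerance that can skip incompatible rows and redirect them (with the error message) to a log file in Blob Storage. A rough sketch of the relevant activity settings is below; the linked service and path names are placeholders, not values from this thread:

```json
{
  "name": "CopyToDynamicsWithErrorLog",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "AzureSqlSource" },
    "sink": { "type": "DynamicsSink", "writeBehavior": "upsert" },

    "enableSkipIncompatibleRow": true,
    "redirectIncompatibleRowSettings": {
      "linkedServiceName": {
        "referenceName": "MyBlobStorage",
        "type": "LinkedServiceReference"
      },
      "path": "copy-activity-errors"
    }
  }
}
```

The redirected log captures the skipped rows and error details, so a follow-up activity could read it and flag the failed rows back in the source table, rather than staging the whole dataset.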
Siddharth Krishna commented
Need ODBC and HANA connectors enabled in ADF Data Flows. I don't want to stage data in ADLS or Blob for transformation, as that requires data movement which is not essential and should be avoided.
Mark Kromer commented
Definitely, adding more native data flow connectors to ADF is on our roadmap. The current pattern that we recommend is to construct pipelines that use Copy Activity with the 90 connectors available to ADF to stage data in Blob or ADLS for data transformation.
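The staging pattern Mark describes can be sketched as a two-activity pipeline: a Copy Activity lands the source data in Blob/ADLS, then an Execute Data Flow activity transforms the staged file. This is an illustrative skeleton only; the dataset, data flow, and source type names (here an SAP Table source) are placeholders:

```json
{
  "name": "StageThenTransform",
  "properties": {
    "activities": [
      {
        "name": "CopySapToBlob",
        "type": "Copy",
        "typeProperties": {
          "source": { "type": "SapTableSource" },
          "sink": { "type": "DelimitedTextSink" }
        },
        "inputs": [ { "referenceName": "SapSourceDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "BlobStagingDataset", "type": "DatasetReference" } ]
      },
      {
        "name": "TransformStagedData",
        "type": "ExecuteDataFlow",
        "dependsOn": [
          { "activity": "CopySapToBlob", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "dataFlow": { "referenceName": "MyMappingDataFlow", "type": "DataFlowReference" }
        }
      }
    ]
  }
}
```

The Data Flow's source then points at the staged Blob dataset (one of the supported dataset types), and a second Copy Activity can move the transformed output on to an unsupported sink such as ODBC if needed.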
It would greatly simplify my Data Factory pipelines if the REST API connector were available in Data Flow. Currently it shows as disabled when I try to configure a REST source.