The JSON editor is OK but is still a barrier to entry. A WYSIWYG UI based on SSIS/Machine Learning Studio would really make this easier to use.
358 votes
Thanks for your feedback. We are working towards an authoring experience that will be very easy and intuitive for you. We will be sharing more details in the coming months.
The Web and OData connectors need to add support for OAuth ASAP. Most other Microsoft services (Office 365, PWA, CRM, etc.) along with many other industry APIs require the use of OAuth. Not having it closes the door on lots of integration scenarios.
252 votes
The OData connector now supports AAD-based OAuth; you can try it out via the Copy Wizard. More details on the OData connector can be found at https://azure.microsoft.com/en-us/documentation/articles/data-factory-odata-connector/. We are keeping this feedback item open – if you need OAuth for the Web connector, please leave a comment.
Need to integrate with SAP but there is no Linked Service option for SAP.
191 votes
ADF now supports copying data from SAP HANA, BW, ECC/ERP, and C4C.
For SAP BW, we are seeking more feedback on your preferred way of extracting data; please fill in this 5-minute survey to let us know your expectations: https://forms.office.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR5nKWJAX7gJAjX4uqJHrcChUMjI2NDk3NjNOVlZEVDhCMzJXRlFXUlBUQy4u
For current support in the SAP connectors, check the corresponding connector articles from https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-overview#supported-data-stores-and-formats
Add configurable REST and SOAP Web Service sources, so it can ingest data from other cloud services.
There are many cloud applications that expose data via a SOAP or REST API. Customers should be able to configure generic REST and SOAP data sources for use in Azure Data Factory. Other ELT and ETL tools such as Dell Boomi, Informatica, SSIS, and Talend have this functionality.
148 votes
Check out the ADF HTTP connector, which can to some degree be used to ingest data from web services: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-http-connector. Please leave comments on any additional features you need or gaps you find.
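As a rough illustration of what the HTTP connector configuration looks like, here is a minimal ADF v1 linked service and dataset sketch. The URL, names, and relative path are placeholder assumptions, not values from the product documentation; consult the linked article for the authoritative schema.

```json
{
  "name": "HttpLinkedService",
  "properties": {
    "type": "Http",
    "typeProperties": {
      "url": "https://api.example.com/",
      "authenticationType": "Anonymous"
    }
  }
}

{
  "name": "HttpSourceDataset",
  "properties": {
    "type": "Http",
    "linkedServiceName": "HttpLinkedService",
    "typeProperties": {
      "relativeUrl": "/v1/orders?format=json",
      "requestMethod": "Get"
    },
    "format": { "type": "JsonFormat" },
    "availability": { "frequency": "Hour", "interval": 1 }
  }
}
```

A copy activity can then use this dataset as its source to pull the endpoint's response into a blob or table sink.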
Let's enable the hottest Big Data technology of the day as a data hub. This would add an in-memory ELT capability to the ADF family.
81 votes
The ADF Spark Activity is live now: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-spark. Currently only BYOC (bring your own cluster) is supported; on-demand Spark clusters will be added later.
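A sketch of what the BYOC Spark activity can look like in ADF v1 pipeline JSON. The linked service name, paths, and policy values below are placeholder assumptions; the linked article has the full schema.

```json
{
  "name": "SparkActivitySample",
  "type": "HDInsightSpark",
  "linkedServiceName": "MyHDInsightSparkCluster",
  "typeProperties": {
    "rootPath": "adfspark",
    "entryFilePath": "pyFiles/main.py",
    "getDebugInfo": "Failure"
  },
  "policy": {
    "timeout": "01:00:00",
    "retry": 1
  }
}
```

Here `rootPath` and `entryFilePath` point at the Spark program uploaded to the blob storage associated with the cluster, and `getDebugInfo` controls when driver/executor logs are copied back for troubleshooting.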
Would be nice to have this list, and have it updated on the fly. Would save hunting through the factory to find which slices were being processed.
Also, it would be nice to see highlighting, possibly with some animation, to indicate which pipelines are currently running in the diagram view.
20 votes
Need your expert input to improve the performance of a copy activity from our Hive table to SQL Server using an ADF pipeline. Currently the copy activity runs in single-threaded mode, and it takes 150 minutes to copy 20 GB of data. This 20 GB has been split into multiple files internally by Hive; we see it holds 51 files. Is there a way in ADF to load these files into SQL Server in parallel?
Note: the internally split Hive files are not managed by us; they are generated automatically by Hive, and the file naming convention inside the Hive table (folder) in blob is not known.
20 votes
We are enabling parallelism in the copy activity, and this will benefit all sources and destinations.
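For the blob-to-SQL scenario above, a hedged sketch of what a tuned copy activity could look like once parallelism is available, using a `parallelCopies` setting under the activity's type properties. Dataset names and batch sizes are placeholder assumptions for illustration.

```json
{
  "name": "CopyHiveFilesToSql",
  "type": "Copy",
  "inputs": [ { "name": "HiveOutputFolderDataset" } ],
  "outputs": [ { "name": "SqlTableDataset" } ],
  "typeProperties": {
    "source": { "type": "BlobSource", "recursive": true },
    "sink": {
      "type": "SqlSink",
      "writeBatchSize": 10000
    },
    "parallelCopies": 8
  }
}
```

With a folder-level input dataset and `recursive` reads, the 51 Hive-generated files could then be picked up and written to the SQL sink concurrently rather than one at a time.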
Datasets can take a long time to describe if many tables should be handled by Data Factory. A schema generator that can be based on a linked service could save a lot of time.
11 votes
When using Data Management Gateway to connect to on-premises SQL Server, the errors SQL Server returns to explain why connections failed during credential setup aren't surfaced. This makes it hard to troubleshoot problems with Data Management Gateway.
9 votes
We have already surfaced the diagnostic tab in the DMG Configuration Manager but we do intend to do more here.
It would be nice to have a sample where we can use Data Factory in an IoT scenario to get started more quickly.
I would really appreciate this!
8 votes
More file formats should be allowed; copying to Azure Blob does not appear to support PDF, Word, image formats, and many others.
It would be really great if we could have some process in place to read PDF, Word, and image files (unstructured data).
7 votes
You can currently copy any file format via the copy activity: simply do not provide the structure element in the dataset. But we do want to surface this in a first-class manner.
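As a minimal sketch of the binary-copy approach described above: an ADF v1 blob dataset that omits both the `structure` and `format` elements, so the copy activity moves the files (PDF, Word, images, etc.) as opaque bytes. The names and folder path are placeholder assumptions.

```json
{
  "name": "BinaryDocumentDataset",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": "MyStorageLinkedService",
    "typeProperties": {
      "folderPath": "documents/pdf/"
    },
    "availability": { "frequency": "Day", "interval": 1 }
  }
}
```

Because no structure or format is declared, no parsing or serialization is attempted; the files arrive at the sink byte-for-byte identical to the source.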
Currently, a pipeline query in the portal editor can only be one line, with no syntax highlighting.
This makes it hard to read and edit, easy to introduce errors (particularly when escaping characters), and hard to spot them.
Please add syntax highlighting, and allow the query to span multiple lines (even when in a Text.Format macro).
6 votes
Thanks for your feedback. We are working on an authoring experience that will give you syntax highlighting. For a query spanning multiple lines, you can store the query in your storage account and reference its path in the 'scriptPath' parameter. This allows your query to span multiple lines while using 'Text.Format'.
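For example, `scriptPath` is supported on the HDInsight script activities; a sketch of a Hive activity referencing a multi-line `.hql` file stored in blob storage (the activity, cluster, storage, and file names here are placeholder assumptions):

```json
{
  "name": "RunHiveQuery",
  "type": "HDInsightHive",
  "linkedServiceName": "MyHDInsightCluster",
  "typeProperties": {
    "scriptPath": "scripts/transform.hql",
    "scriptLinkedService": "MyStorageLinkedService",
    "defines": {
      "inputTable": "rawlogs"
    }
  }
}
```

The query in `transform.hql` can be formatted freely across multiple lines, and values in `defines` are passed to the script as Hive variables, avoiding awkward escaping in the pipeline JSON.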
At present you need co-admin rights to deploy, and businesses cannot give out these rights. As a subscription owner I should be able to deploy from Visual Studio, since these rights already give me access in the portal to create and delete!
5 votes
The team is working on relaxing these constraints. Please stay tuned.