Currently Visual Studio 2017 does not support Azure Data Factory projects.
Although the Azure SDK is now included in VS2017 along with all the other services, the ADF project type is not.
Can you please include this feature so developers can upgrade from VS2015?
It is not currently possible to identify the IP address of the Data Factory, which you need for firewall rules, including the Azure SQL Server firewall. (1,172 votes)
Thank you for your suggestion. We understand it is very important to be able to whitelist a specific IP list for ADF as part of firewall rules rather than opening network access to all. We are working on this with high priority. Once this is ready, we will also add ADF to the list of “Trusted Azure services” for Azure Storage and Azure SQL DB/DW.
Support pushing data into SFTP as a sink in the copy activity. (624 votes)
Thanks, all, for the overwhelming feedback on adding SFTP as a sink as a native capability. We are actively planning for this and will keep you updated in the coming months!
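Until a native sink ships, the requested behavior can only be sketched. Below is a hypothetical copy-activity definition writing from Blob to SFTP, expressed as a Python dict mirroring the pipeline JSON shape; the `SftpSink` type name and the dataset names are assumptions, not a released schema.

```python
# Hypothetical sketch only: no native SFTP sink exists yet, so the
# "SftpSink" type name and dataset names below are assumptions, expressed
# as a Python dict mirroring the copy-activity JSON shape.
copy_to_sftp = {
    "name": "CopyBlobToSftp",
    "type": "Copy",
    "inputs": [{"referenceName": "SourceBlobDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "TargetSftpDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "BlobSource"},
        "sink": {"type": "SftpSink"},  # assumed name for the requested sink
    },
}
```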
Power Query is awesome! It would be a great feature to be able to output its result into either a SQL database or Azure (Storage or SQL). (447 votes)
Please check the new capability we recently unveiled called Wrangling Data Flows, available in preview! Wrangling Data Flows allow you to discover and explore your data using the familiar Power Query Online mashup editor for data preparation, and then execute at scale on a Spark runtime.
Sign up for preview access at: https://forms.office.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR9-OHbkcd7NIvtztVhbGIU9UNk5QM0dSWkFDSkFPUlowTFJMRVZUUUZGRi4u and check out more details at https://aka.ms/wranglingdfdocs
There has to be support for automated testing of Azure Data Factory pipelines, perhaps as part of a Visual Studio ADF project suite. (324 votes)
Add Excel files as a source. (313 votes)
Thank you for your feedback. We are evaluating this ask and will keep everyone posted on future plans.
Some customers need to extract information from Google Analytics in order to build a data lake or SQL DW and gather marketing insights by mixing it with other kinds of data.
Today we rely on paid custom SSIS packages or develop custom code.
If it is not possible in Azure Data Factory, perhaps another Azure service, maybe Logic Apps, could offer a native connector to extract this data. (266 votes)
Thank you for your feedback. We will evaluate this ask as part of the product roadmap.
Please allow setting a certain activity to enabled or disabled, much like you can do in SSIS.
This is important when you are developing and only want to execute a certain part of the pipeline, for example. (263 votes)
The Web and OData connectors need to add support for OAuth ASAP. Most other Microsoft services (Office 365, PWA, CRM, etc.), along with many other industry APIs, require the use of OAuth. Not having it closes the door on many integration scenarios. (261 votes)
The OData connector now supports AAD-based OAuth; you can try it out via the copy wizard. More details on the OData connector can be found at https://azure.microsoft.com/en-us/documentation/articles/data-factory-odata-connector/. Keep this feedback alive: if you need OAuth for the Web connector, please leave a comment.
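As an illustration of the shipped capability, a linked service for OData with AAD-based OAuth might look like the sketch below, written as a Python dict mirroring the linked-service JSON. The property names and placeholder values are assumptions to be checked against the connector documentation.

```python
# Illustrative linked-service definition for OData with AAD-based OAuth;
# property names and placeholder values are assumptions, not verified schema.
odata_linked_service = {
    "name": "ODataWithAadOAuth",
    "properties": {
        "type": "OData",
        "typeProperties": {
            "url": "https://services.example.com/odata",  # hypothetical endpoint
            "authenticationType": "AadServicePrincipal",  # AAD-based OAuth
            "servicePrincipalId": "<application-id>",
            "tenant": "<tenant-id>",
        },
    },
}
```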
Can we have a copy activity for XML files, along with validation of the XML file's schema against an XSD? If schema validation succeeds, copy; otherwise fail the activity. This would be useful for the following scenarios:
1. Blob to Blob
2. Blob to SQL
3. SQL to Blob
If all of the above worked with a specified schema, that would be great. (247 votes)
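The requested pass/fail behavior can be sketched locally in Python. The standard library cannot validate against an XSD (in practice a schema library such as lxml's `XMLSchema` would be used), so this sketch uses well-formedness as a stand-in check:

```python
import shutil
import xml.etree.ElementTree as ET

def copy_if_valid(src_path, dst_path):
    """Copy src to dst only if it parses as well-formed XML; fail otherwise.

    Stand-in for the requested activity: full XSD validation would need a
    schema library (e.g. lxml's XMLSchema); ElementTree only checks
    well-formedness.
    """
    try:
        ET.parse(src_path)
    except ET.ParseError as err:
        raise RuntimeError(f"XML validation failed, activity should fail: {err}")
    shutil.copyfile(src_path, dst_path)
```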
Thank you for your feedback. We will evaluate this ask as part of product roadmap.
We need to use Bitbucket for a project. We are mirroring our Azure DevOps repo with the pipelines to Bitbucket. It would be easier if there was integration with Bitbucket. (204 votes)
Thank you for your feedback. We will evaluate this ask as part of our product roadmap.
Provide the capability to copy data from Blob to the Snowflake data warehouse. (182 votes)
Currently in Azure Data Factory, there is no functionality to restart an entire pipeline. If we need to refresh a dataset in Azure, all associated activities in the pipeline have to be selected and run separately. Can we have an option to run the entire pipeline when required? (172 votes)
As of now, and according to the official documentation (https://docs.microsoft.com/en-us/azure/azure-subscription-service-limits#data-factory-limits), there is a max-activities-per-pipeline limit of 30; it'd be nice to increase this number to at least 500. (162 votes)
Please add support to run a PowerShell script as an activity inside Azure Data Factory. It would help developers overcome many of the shortcomings via scripting. (161 votes)
Now that we have Azure Functions available that can help make batch data processing more real-time, it would be great to be able to programmatically invoke the workflow component of ADF for immediate execution of a pipeline. (153 votes)
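Pipelines in ADF v2 can already be triggered on demand through the ARM REST API's `createRun` endpoint. A minimal sketch that builds the request URL; the caller would POST to it with an AAD bearer token, and all the resource names shown are placeholders:

```python
def create_run_url(subscription_id, resource_group, factory, pipeline):
    """Build the ARM REST endpoint for an on-demand pipeline run.

    The caller POSTs to this URL with an AAD bearer token. The api-version
    shown is illustrative; check the current REST API reference.
    """
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory}"
        f"/pipelines/{pipeline}/createRun"
        "?api-version=2018-06-01"
    )
```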
Currently slices run in UTC, and if data sources are in other timezones, a simple DATEADD in the WHERE clause will cause missed data when there is a DST change.
Additionally, adding DATEADD to every source is error prone, especially if a server changes its timezone in the future.
Allow us to set the timezone at the pipeline level, the linked service level, or the dataset level; any of these would do as long as ADF transparently translates SliceStart and SliceEnd to the appropriate timezone. (149 votes)
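The requested translation, converting UTC slice boundaries into a source timezone with DST handled correctly, can be sketched with Python's `zoneinfo` (stdlib in 3.9+):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib in Python 3.9+

def slice_bounds_local(slice_start_utc, slice_end_utc, tz_name):
    """Translate UTC slice boundaries into a source's local timezone.

    Unlike a fixed DATEADD offset, astimezone() applies the correct
    UTC offset on either side of a DST transition.
    """
    tz = ZoneInfo(tz_name)
    return slice_start_utc.astimezone(tz), slice_end_utc.astimezone(tz)
```

For America/New_York, 12:00 UTC maps to 07:00 local in January (EST, UTC-5) but 08:00 local in July (EDT, UTC-4); a fixed offset would get one of these wrong, which is exactly the missed-data problem described above.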
Azure Data Factory pipeline activity to refresh Azure Analysis Services cube partitions. (148 votes)
For now, check and use the custom activity sample from https://github.com/Azure/Azure-DataFactory/tree/master/Samples/AzureAnalysisServicesProcessSample; this is also mentioned in the AS GA blog (https://azure.microsoft.com/en-us/blog/announcing-azure-analysis-services-general-availability/). We are also evaluating future native first-class support.
We have activity dependencies today, but they are always a logical AND. If we have Activity1 -> Activity2 -> Activity3 and we want to say "if any of these activities fail, run Activity4", it isn't straightforward. In SSIS, we can choose an expression and choose whether we need one or all conditions to be true when there are multiple constraints. We need similar functionality here. It can be achieved with a bit of creativity (repeat the failure activity as the single failure path after each of the original activities, or use the If Condition to write logic that would denote failure in a previous activity), but it's more convoluted than it needs to be. (145 votes)
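The one-vs-all semantics requested above fit in a few lines; a sketch of the evaluation logic an OR-capable dependency would implement over upstream activity outcomes:

```python
def should_run_cleanup(statuses, require_all=False):
    """Evaluate failure-dependency semantics over upstream activity outcomes.

    require_all=False mirrors the requested logical OR (run if ANY upstream
    activity failed); require_all=True is today's implicit AND.
    """
    failed = [s == "Failed" for s in statuses]
    return all(failed) if require_all else any(failed)
```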
There are numerous instances when an output (statistics) or error file has to be mailed to administrators. Email as an activity would help implement this functionality. (137 votes)
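A common workaround today is an ADF Web Activity that POSTs to a Logic App HTTP trigger, which then sends the mail. A sketch of building such a payload; the field names are hypothetical (the Logic App defines its own trigger schema), not an official ADF feature:

```python
import json

def build_email_payload(to, subject, body, attachment_url=None):
    """Build the JSON body a Web Activity could POST to a Logic App HTTP
    trigger that sends the mail. Field names are hypothetical; the Logic
    App's own trigger schema is authoritative.
    """
    payload = {"to": to, "subject": subject, "body": body}
    if attachment_url:
        payload["attachmentUrl"] = attachment_url
    return json.dumps(payload)
```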
Thank you for your feedback! We will evaluate this as part of our product roadmap.