Add Excel file as source. (368 votes)
Thank you for your feedback. We are evaluating this ask and will keep everyone posted on future plans.
Some customers need to extract data from Google Analytics into a data lake or SQL DW in order to gather marketing insights by combining it with other kinds of data.
Today the options are paid custom SSIS packages or developing custom code.
If it is not possible in Azure Data Factory, another native way in Azure to extract this data would also help … maybe Logic Apps. (271 votes)
Thank you for your feedback. We will evaluate this ask as part of the product roadmap.
Can we have a copy activity for XML files, along with validation of the XML file's schema against an XSD? If schema validation succeeds, copy; otherwise fail the activity. This would be useful for the scenarios below:
1. Blob to Blob
2. Blob to SQL
3. SQL to Blob
If all of the above can work with a specified schema, that would be great. (263 votes)
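Until such validation exists in the service, the requested gate can be sketched outside it. The Python stand-in below is illustrative only: the helper names are hypothetical, and real XSD validation would need a library such as lxml rather than the standard library, so this sketch only checks well-formedness plus a list of required elements before allowing the copy.

```python
# Sketch of a pre-copy validation gate (helper names are hypothetical).
# Real XSD validation needs a library such as lxml; this stdlib stand-in
# only checks well-formedness plus a required-element list.
import xml.etree.ElementTree as ET

def validate_xml(xml_text, required_tags):
    """Return True if the XML parses and contains every required tag."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    found = {elem.tag for elem in root.iter()}
    return all(tag in found for tag in required_tags)

def copy_if_valid(xml_text, required_tags, copy_fn):
    """Run the copy step only when validation succeeds; otherwise fail."""
    if not validate_xml(xml_text, required_tags):
        raise ValueError("Schema validation failed; copy aborted")
    return copy_fn(xml_text)
```

Since validate_xml returns False for both malformed XML and missing elements, the copy step is skipped in either case, matching the "validate then copy or fail" flow described above.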
Thank you for your feedback. We will evaluate this ask as part of our product roadmap.
We need to use Bitbucket for a project. We are mirroring our Azure DevOps repo, with the pipelines, to Bitbucket. It would be easier if there were direct integration with Bitbucket. (214 votes)
Thank you for your feedback. We will evaluate this ask as part of our product roadmap.
There are numerous instances when an output (statistics) or error file has to be emailed to administrators. An email activity would help implement this functionality. (141 votes)
Thank you for your feedback! We will evaluate this as part of our product roadmap.
Source and sink. (132 votes)
Thank you for the feedback. We will look into this.
Pause/start Azure SQL Data Warehouse from ADF. (82 votes)
Thank you for your feedback. We are now evaluating this ask as part of our product roadmap.
Can ADF consume data from or push data to an Azure App Service API, with support for Swagger APIs? (78 votes)
Thank you for the feedback. We will look into this.
We use this approach to encrypt sensitive flat files at rest. Please add a feature to ADF that supports reading encrypted flat files from blob storage.
Thanks for the feedback. We are looking at this.
Currently, data factory has alerts for failed and succeeded runs only.
There are multiple other conditions that need action, so the user should be alerted:
- Timed out runs
- Data gateway updates required
- Linked service credentials expired/expiring soon
Can alerts for these conditions be added? (21 votes)
It would be great to be able to preview the data coming from the data sources, or after some transformation. (18 votes)
When SqlSource is used in a copy activity, there should be an option to specify a script path. If the SQL query is very big, it is difficult to put the whole content on a single line.
Below is the existing support; we should also have support for scriptpath inside the source key.
"sqlReaderQuery": "SELECT TOP 300 * FROM dbo.Employee"
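Until a scriptpath option exists, one workaround is to keep the query in a multi-line .sql file and collapse it into the single-line sqlReaderQuery at deployment time. The Python sketch below is illustrative only; the pipeline fragment and helper name are assumptions, not an ADF feature.

```python
# Sketch: approximate the requested scriptpath behavior at deployment time
# by collapsing a multi-line SQL script into the single-line sqlReaderQuery.
# The helper name and JSON shape are illustrative, not an ADF feature.
import json

def inject_reader_query(pipeline_json, sql_text):
    """Flatten a multi-line SQL script and set it as sqlReaderQuery."""
    pipeline = json.loads(pipeline_json)
    query = " ".join(line.strip() for line in sql_text.splitlines() if line.strip())
    pipeline["source"]["sqlReaderQuery"] = query
    return json.dumps(pipeline)
```

For example, a script containing "SELECT TOP 300 *" and "FROM dbo.Employee" on separate lines becomes the single-line query shown above.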
Thanks for the feedback. This is a common ask and we will see how to improve the experience.
Currently it appears that the ADF Copy Wizard does not use PolyBase in SQL DW (CREATE EXTERNAL TABLE AS SELECT...) to export the contents of a table into blob storage. As this would be much faster, please support it. Also, if the Copy Wizard is used to copy from DW to DW, please PolyBase out and PolyBase in.
The same should apply to SQL DW to Azure Data Lake Store, as PolyBase now supports that. (13 votes)
Thanks for the feedback. To help us further evaluate this ask, please comment on the scenarios in which you want to extract data out of SQL DW.
Allow a filter to be defined for a dataset, something like a WHERE clause. The on-premises Oracle example filters by date, but the filter lives in the pipeline as text manipulation of a SQL statement, which seems error-prone. (11 votes)
Thanks for the great feedback. We will review this requirement and update you on the next steps.
I propose that copy activity supports type coercion between source and sink data types.
For example, if the source column is a String field containing "True" and the sink column is a Boolean column, then the value should be written to the sink as a true Boolean and not as a 4-character string.
(Or whatever alternate mapping you like; this is how Convert.ToBoolean(String) works in C#.)
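The requested coercion can be illustrated with a small sketch. The column names and type map below are hypothetical, and to_bool mimics the case-insensitive "True"/"False" parsing of C#'s Convert.ToBoolean.

```python
# Sketch of per-column type coercion between source strings and sink types.
# Column names and the SINK_TYPES map are illustrative assumptions.
def to_bool(value):
    """Parse 'True'/'False' (case-insensitive), like C# Convert.ToBoolean."""
    v = value.strip().lower()
    if v == "true":
        return True
    if v == "false":
        return False
    raise ValueError(f"Cannot coerce {value!r} to bool")

# Hypothetical sink schema: which converter each column needs.
SINK_TYPES = {"is_active": to_bool, "age": int, "score": float}

def coerce_row(row):
    """Apply the sink column's converter to each source string value."""
    return {col: SINK_TYPES.get(col, str)(val) for col, val in row.items()}
```

With this map, coerce_row({"is_active": "True", "age": "42"}) yields a real Boolean and integer rather than strings, which is the behavior the request describes.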
Thanks for the feedback. We are looking at supporting some degree of data type conversion in the future.
The Execute method on a custom activity takes an instance of IActivityLogger, which can then be used to write trace messages for diagnostic purposes. Why can we not use the standard .NET trace classes instead (i.e. Trace or TraceSource), wired up to the appropriate data factory trace listener in the hosting environment? This would obviate the need to pass the IActivityLogger around in custom activity code and would also simplify unit testing. (5 votes)
When reading web log files from Blob Storage as CSV, comment lines begin with '#', which causes the copy activity to fail with format exceptions. The suggestion is to support specifying a comment character ('#') so that those lines are ignored. (3 votes)
For now, a workaround is to use the feature of skipping incompatible rows to ignore those comment lines during copy. Check https://docs.microsoft.com/en-us/azure/data-factory/data-factory-copy-activity-fault-tolerance
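Another workaround is to pre-process the file before the copy step, dropping comment lines up front. A minimal Python sketch (helper names are illustrative):

```python
# Sketch of the suggested comment-character behavior as a pre-copy filter:
# drop lines starting with '#' before handing the file to the copy step.
import csv
import io

def strip_comments(text, comment_char="#"):
    """Remove lines whose first non-blank character is the comment char."""
    kept = [line for line in text.splitlines()
            if not line.lstrip().startswith(comment_char)]
    return "\n".join(kept)

def read_weblog_csv(text):
    """Parse the filtered text as CSV rows."""
    return list(csv.reader(io.StringIO(strip_comments(text))))
```

Applied to a typical web log with "#Version:" and "#Fields:" header lines, only the data rows reach the CSV parser.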
My use case is to call different stored procedures. From the ADF perspective, I want to know whether the stored procedure executed successfully or not. For this use case I do not want to specify input and output tables; ADF is used for orchestration only. My problem: the editor forces me to provide input/output information, and the documentation also says that this information is necessary. Why?
Thanks for the great feedback. We will review this requirement and get back to you…