Thanks for the valid suggestion. Your feedback is now open for the user community to upvote, which allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.

Nick shared this idea
This is very important for us, as we want to run the SFTP upload on a self-hosted integration runtime due to IP whitelisting on the server that we upload to. I can create a Databricks or Azure Functions upload, but we can't really whitelist the public IPs used by these systems. We need the upload to come from our own IP address, which means a copy activity.
Please see Kay’s comment below. Now that AS works on PQ, this of course becomes much easier. However, there is still some work and testing needed to make each connector available, so we will likely evaluate which ones are better suited to traditional BI and prioritize accordingly.
I would dearly love to see the Azure Databricks spark connector available for Azure Analysis Services.
The workaround at the moment is to create external PolyBase tables in Azure SQL Data Warehouse which read the underlying Parquet files. This just adds unneeded complexity and cost to our architecture, as we don't want or need to store any data in Azure SQL DW.
I haven't seen much progress at all in Azure Analysis Services lately, so perhaps all the attention is elsewhere at the moment?
Yes, we also have the case of an external SFTP site that requires whitelisting to connect.