Static IP ranges for Data Factory and add ADF to list of Trusted Azure Services
It is not currently possible to identify the IP address of the Data Factory, which you need for firewall rules, including the Azure SQL Server firewall.
We want to share the great news that ADF has been added to the list of “Trusted Azure services” for Azure Key Vault and Azure Storage (Blob and ADLS Gen2)! You can now enable “Allow trusted Microsoft services” on AKV and Azure Storage for better network security, and your ADF pipelines will continue to run. There are two caveats to pay attention to:
- For ADF to be considered one of the “trusted Microsoft services”, you need to use managed identity (MSI) to authenticate to AKV or Azure Storage in the linked service definition.
- If you are running a Mapping Data Flow activity, “Trusted Azure services” is not supported for Data Flow just yet; we are working hard on it.
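To illustrate the MSI caveat: with “Allow trusted Microsoft services to access this storage account” enabled on the storage firewall, the linked service must authenticate with the factory's managed identity rather than an account key. A minimal sketch of such a Blob Storage linked service, assuming a hypothetical storage account named `mystorageaccount`:

```json
{
  "name": "AzureBlobStorageViaMSI",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "serviceEndpoint": "https://mystorageaccount.blob.core.windows.net/"
    }
  }
}
```

Specifying `serviceEndpoint` instead of a `connectionString` is what selects managed identity authentication; the factory's identity also needs an appropriate RBAC role (such as Storage Blob Data Reader) on the storage account.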
What is coming up? Here are the additional enhancements we are making for better network security:
- Static IP ranges for the Azure Integration Runtime, so that you can whitelist specific IP ranges for ADF as part of firewall rules. ETA is in the next few months.
- Support for an ADF service tag
We will provide an update as soon as these enhancements become available. Please stay tuned, and thank you for using ADF!
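For reference, once an ADF service tag ships, admitting Data Factory traffic through an NSG could look like the sketch below — an ARM security-rule fragment with an assumed tag name `DataFactory` and an assumed destination port 1433 (SQL); the final tag name and your ports may differ:

```json
{
  "name": "Allow-DataFactory-Inbound",
  "properties": {
    "priority": 100,
    "direction": "Inbound",
    "access": "Allow",
    "protocol": "Tcp",
    "sourceAddressPrefix": "DataFactory",
    "sourcePortRange": "*",
    "destinationAddressPrefix": "*",
    "destinationPortRange": "1433"
  }
}
```

A service tag in `sourceAddressPrefix` would remove the need to track individual IP ranges, since Azure maintains the address list behind the tag.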
Hard to believe that ADF cannot dip directly into Azure VNets or present a single source IP. What good is it if you have to run self-hosted Integration Runtimes everywhere?
Azure Data Factory cannot directly access Azure Storage when the firewall is enabled. We need to use a self-hosted integration runtime.
We would like ADF to be added to the trusted Microsoft services in the Azure Storage firewall settings.
David Van Wart commented
It is absolutely unbelievable that Azure doesn't have a fix for this, while still having nothing better for native Cosmos DB backup than the built-in option (4-hour intervals, two sets retained) or using Data Factory. Very disappointing.
Martin O'Gorman commented
Could we add an NSG service tag for Data Factory? The IP address allocation described above would also solve the problem.
Francis Thorne commented
I am also having the same problem with Azure Storage Gen2. After enabling the firewall, Data Factory can’t access blobs/files under the storage account. Is there a solution in the pipeline for this? It's a real showstopper.
Have the same problem with Azure Storage Gen2. After enabling the firewall, Data Factory can’t access the data lake under the storage account. Looking for a solution...
Evgeny Lutsky commented
Any update for this?
Sebastien Gissinger commented
Just found a workaround.
Create an Azure Key Vault linked service pointing at a Key Vault that has the firewall enabled.
Then create, for example, a SQL Database linked service that uses this Key Vault linked service for the password.
You will get an error; open it:
"Failed to get the secret from key vault, secretName: mySecretName, secretVersion: , vaultBaseUrl: https://MYKEYVAULT.vault.azure.net/. The error message is: Client address (%IP_ADRESS%) is not authorized and caller is not a trusted service".
The IP may vary with location, which is why I haven't listed it here.
The real question is: why is Data Factory not an Azure trusted service?
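The workaround above boils down to a SQL Database linked service whose password comes from a Key Vault linked service. A sketch with hypothetical names (`MyKeyVaultLS`, `mySecretName`, placeholder server and database):

```json
{
  "name": "AzureSqlViaKeyVault",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=myuser;",
      "password": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "MyKeyVaultLS",
          "type": "LinkedServiceReference"
        },
        "secretName": "mySecretName"
      }
    }
  }
}
```

With the Key Vault firewall enabled, testing this connection fails with the error quoted above, and the error message exposes the caller's current public IP.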
Robert Lenior commented
Is this still an issue?! I hope this will be resolved soon.
ADF hits AKV to retrieve keys through a linked service in a VNet configuration. The job fails because the client IP address is not authorized and the caller is not a trusted service.
Each ADF execution comes from a different public IP. This needs to be corrected.
We have restrictions on connecting to a German cloud for production: we are required to whitelist the IP address of incoming traffic at the firewall. Can Data Factory expose the IP address it uses to connect to the target database? Otherwise Data Factory will not be a viable option for us.
Conor McCarthy commented
We have a provider that whitelists our IP to give us access to their SFTP site. Currently accessing it from SSIS running on an Azure VM. Would really like to use ADF, but impossible if we can't provide an IP address.
I tried the same, by creating a self-hosted IR on one of the on-prem machines and whitelisting its IP on the target server. It does work, but sometimes it fails: I have some 16 .csv files to be copied from SFTP to blob, and it is not able to copy all of them. It fails for some files, and I am not sure of the reason.
Vimal Amaldas commented
Can this be resolved by creating a self-hosted Integration Runtime VM for ADF and whitelisting the IP of the self-hosted IR VM on the target data store?
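One caveat with that approach: the whitelist entry only stays valid if the IR VM's outbound IP never changes, so the VM's public IP should be statically allocated. As a sketch, the public IP could be declared as an ARM resource fragment like the one below (name, region, and API version are assumptions):

```json
{
  "type": "Microsoft.Network/publicIPAddresses",
  "apiVersion": "2020-06-01",
  "name": "selfhosted-ir-pip",
  "location": "westeurope",
  "sku": { "name": "Standard" },
  "properties": {
    "publicIPAllocationMethod": "Static"
  }
}
```

Standard-SKU public IPs are always statically allocated, so the address you whitelist on the target data store survives VM restarts and deallocations.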
I have an SFTP server on an Azure VM. It's on a VNet with firewall rules. How am I supposed to create that linked service in Data Factory? I am able to create an API connection for use with a Logic App by providing the certificate and adding the range of addresses that Logic Apps use to the firewall. Is there some other way I can allow Data Factory to gain access to the SFTP linked service? I was looking at application security groups; will that work? And what if I later want to get rid of the VM and use a 3rd-party SFTP service for my customers to dump their data to? Thanks!
Yes, we also have the case of an external SFTP site that requires whitelisting to connect.
I find it amusing that Data Factory, a service promoted as the solution to data movement problems, has an even bigger problem accessing internal Azure services.
Wlodek Bielski commented
This is an absolute showstopper for using Azure Data Factory with, e.g., Azure Analysis Services. If we set the firewall on AAS, it won't allow ADF to access the model: there is no “Allow access to Azure services” option, nor is it possible to whitelist an ADF IP.
Pavel Leonau commented
It would be great to be able to get the IP address of the Data Factory service (or a whitelist, if available) in order to use it against a production server.
Paul Pavlinovich commented
I'm amazed you cannot do this now?!? Back to Informatica I guess.