Static IP ranges for Data Factory and add ADF to list of Trusted Azure Services
It is not currently possible to identify the IP address of the Data Factory, which you need for firewall rules, including the Azure SQL Server firewall.
Great news – static IP ranges for the Azure Integration Runtime are now available in all ADF regions! You can whitelist specific IP ranges for ADF as part of your firewall rules. The ranges are documented here: https://docs.microsoft.com/en-us/azure/data-factory/azure-integration-runtime-ip-addresses#azure-integration-runtime-ip-addresses-specific-regions. Static IP ranges for the Gov cloud and China cloud will be published soon!
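For example, once you have looked up the published range for your factory's region, adding it to an Azure SQL Server firewall could look like the sketch below. This is a minimal sketch using the ARM REST API from Python; the subscription, resource group, server name, rule name, api-version, and the IP range itself are placeholders or assumptions to replace with your own values.

```python
# Sketch: whitelist an ADF Azure Integration Runtime IP range on an Azure SQL
# logical server firewall. All identifiers and the IP range are placeholders;
# take the real range for your region from the documentation linked above.
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
server_name = "<sql-server-name>"
rule_name = "Allow-ADF-AzureIR"

# Placeholder range only; substitute the documented range for your region.
start_ip, end_ip = "40.74.0.0", "40.74.0.255"

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.Sql"
    f"/servers/{server_name}/firewallRules/{rule_name}?api-version=2021-11-01"
)
body = {"properties": {"startIpAddress": start_ip, "endIpAddress": end_ip}}

resp = requests.put(url, json=body, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
print(resp.status_code)
```

Note that SQL firewall rules take start/end addresses rather than CIDR blocks, so a documented CIDR range needs to be expanded to its first and last address.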
Please refer to this blog post on how you can use the various mechanisms, including trusted Azure services and static IP ranges, to secure data access through ADF:
Service tag support will be made available in the next few weeks. Please stay tuned!
If your network security requirements call for VNet support in ADF and cannot be met using trusted Azure services (released in Oct 2019), static IP ranges (released in Jan 2020), or service tags (upcoming), please vote for the VNet feature here: https://feedback.azure.com/forums/270578-data-factory/suggestions/37105363-data-factory-should-be-able-to-use-vnet-without-re
Azure Data Factory cannot directly access an Azure Storage account that has its firewall enabled. We need to use a self-hosted integration runtime.
ADF should be added to the list of trusted Microsoft services in the Azure Storage firewall settings.
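In the meantime, the storage-side setting that the trusted-services exemption relies on (together with managed identity authentication for ADF) can be configured programmatically. A minimal sketch with the azure-mgmt-storage Python SDK; resource names are placeholders and model names may differ slightly between SDK versions.

```python
# Sketch: keep the storage account firewall on (deny by default) while letting
# trusted Microsoft services through. Resource names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import NetworkRuleSet, StorageAccountUpdateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.storage_accounts.update(
    "<resource-group>",
    "<storage-account>",
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(
            default_action="Deny",   # firewall enabled: deny traffic by default
            bypass="AzureServices",  # allow trusted Microsoft services through
        )
    ),
)
```

This only helps ADF once ADF itself is honored as a trusted service and authenticates with its managed identity, which is exactly what this request asks for.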
David Van Wart commented
It is absolutely unbelievable that Azure doesn't have a fix for this, while still not having anything better for native Cosmos DB backup than the built-in option (4-hour intervals, 2 sets retained) or using Data Factory. Very disappointing.
Martin O'Gorman commented
Could we add an NSG service tag for Data Factory? The IP address allocation described above would also solve the problem.
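Once the service tag mentioned in the team's update above is available, an NSG rule that uses it might look roughly like this. A sketch with the azure-mgmt-network Python SDK; the tag name "DataFactory", the resource names, priority, and port are assumptions, and older SDK versions expose create_or_update instead of begin_create_or_update.

```python
# Sketch: an inbound NSG rule that admits Data Factory traffic by service tag
# instead of by maintaining IP lists. Tag name, resource names, priority, and
# port are assumptions; check the published service tag documentation.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import SecurityRule

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

rule = SecurityRule(
    protocol="Tcp",
    direction="Inbound",
    access="Allow",
    priority=300,
    source_address_prefix="DataFactory",  # assumed service tag name
    source_port_range="*",
    destination_address_prefix="*",
    destination_port_range="443",
)

client.security_rules.begin_create_or_update(
    "<resource-group>", "<nsg-name>", "Allow-DataFactory-Inbound", rule
).result()
```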
Francis Thorne commented
I am also having the same problem with Azure Storage Gen2. After enabling the firewall, Data Factory can't access blobs/files under the storage account. Is there a solution in the pipeline for this? It's a real showstopper.
I have the same problem with Azure Storage Gen2. After enabling the firewall, Data Factory can't access the data lake under the storage account. Looking for a solution.
Evgeny Lutsky commented
Any update for this?
Sebastien Gissinger commented
Just found a workaround.
Create an Azure Key Vault linked service pointing at an Azure Key Vault that has its firewall enabled.
Then create, for example, a SQL Database linked service that uses this Azure Key Vault linked service for the password.
You will get an error; open it:
"Failed to get the secret from key vault, secretName: mySecretName, secretVersion: , vaultBaseUrl: https://MYKEYVAULT.vault.azure.net/. The error message is: Client address (%IP_ADDRESS%) is not authorized and caller is not a trusted service".
The IP may vary with location, which is why I didn't include it here (the two linked services are sketched below).
The real question is: why is Data Factory not an Azure trusted service?
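For reference, the two linked services in this workaround can also be created programmatically rather than through the portal. A rough sketch with the azure-mgmt-datafactory Python SDK; every name is a placeholder, and the exact model signatures (for example whether LinkedServiceReference needs an explicit type argument) vary between SDK versions.

```python
# Sketch of the workaround's two linked services: a Key Vault linked service
# pointing at a firewalled vault, and a SQL Database linked service that reads
# its password from that vault. Using it surfaces ADF's outbound IP in the
# "Client address ... is not authorized" error quoted above.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureKeyVaultLinkedService,
    AzureKeyVaultSecretReference,
    AzureSqlDatabaseLinkedService,
    LinkedServiceReference,
    LinkedServiceResource,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, factory = "<resource-group>", "<data-factory-name>"

# 1. Linked service for a Key Vault that has its firewall enabled.
client.linked_services.create_or_update(
    rg, factory, "MyKeyVaultLS",
    LinkedServiceResource(
        properties=AzureKeyVaultLinkedService(
            base_url="https://MYKEYVAULT.vault.azure.net/"
        )
    ),
)

# 2. SQL Database linked service whose password is read from that vault.
client.linked_services.create_or_update(
    rg, factory, "MySqlDatabaseLS",
    LinkedServiceResource(
        properties=AzureSqlDatabaseLinkedService(
            connection_string=(
                "Server=tcp:myserver.database.windows.net,1433;"
                "Database=mydb;User ID=myuser;"
            ),
            password=AzureKeyVaultSecretReference(
                store=LinkedServiceReference(
                    type="LinkedServiceReference", reference_name="MyKeyVaultLS"
                ),
                secret_name="mySecretName",
            ),
        )
    ),
)
```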
Robert Lenior commented
Is this still an issue?! I hope it will be resolved soon.
ADF hits Azure Key Vault to retrieve keys through a linked service in a VNet configuration. The job fails because the client IP address is not authorized and the caller is not a trusted service.
On each execution, ADF hits it from a different public IP. This needs to be corrected.
We have restrictions on connecting to the German cloud for production. It requires us to whitelist the IP address of the incoming traffic at the firewall. Can Data Factory expose the IP address it uses to connect to the target database? Otherwise Data Factory will not be a viable option for us.
Conor McCarthy commented
We have a provider that whitelists our IP to give us access to their SFTP site. Currently accessing it from SSIS running on an Azure VM. Would really like to use ADF, but impossible if we can't provide an IP address.
I tried the same by creating a self-hosted IR on one of our on-prem machines and whitelisting its IP on the target server. It does work, but sometimes it fails: I have around 16 .csv files to be copied from SFTP to blob, and it is not able to copy all of them; it fails for some files, and I am not sure of the reason.
Vimal Amaldas commented
Can this be resolved by creating a self-hosted integration runtime VM with ADF and whitelisting the IP of that self-hosted IR VM on the target data store?
I have an SFTP server on an Azure VM. It's on a VNet with firewall rules. How am I supposed to create that linked service in Data Factory? I am able to create an API connection for use with a Logic App by providing the certificate and adding the range of addresses that Logic Apps use to the firewall. Is there some other way I can allow the Data Factory to gain access to the SFTP linked service? I was looking at application security groups; will that work? And what if I later want to get rid of the VM and use a third-party SFTP service for my customers to drop their data to? Thanks!
Yes, we also have the case of an external SFTP site that requires whitelisting to connect.
I find it amusing that a service like Data Factory, promoted as the solution to data movement problems, has a bigger problem accessing internal Azure services.
Wlodek Bielski commented
This is an absolute showstopper for using Azure Data Factory with, for example, Azure Analysis Services. If we enable the firewall on AAS, it won't allow ADF to access the model, as there is no "Allow access to Azure services" option, nor is it possible to whitelist the ADF IP.
Pavel Leonau commented
It would be great to have a way to get the IP address of a Data Factory service (or the whitelist, if available) in order to use the process on a production server.
Paul Pavlinovich commented
I'm amazed you cannot do this now?!? Back to Informatica I guess.
Andy Ball commented
Would like this too. We have an InfoSec requirement to limit access to HDInsight to on-prem addresses only, i.e. block access for people outside the company. If we do this via an NSG, it breaks Data Factory connectivity to HDInsight, which is used to run a Python script as part of a transform.
So at present the only way I can see to fix this is to change the NSG to allow traffic on port 443 to the whole Azure IP range, which is very open and has to be checked/refreshed weekly.