SFTP (and FTPS) protocol support for Azure Files
Exposing the SFTP protocol would facilitate many scenarios that today require two VMs (with all the management overhead that implies).
Thanks for the feedback! We are interested in collecting feedback on this request – please vote for it if this is something you'd like to see.
We're also interested in learning more about what people want to use SFTP/FTPS for and which protocol they prefer. Please feel free to leave us a comment with more detail!
Program Manager, Azure Files
Craig R commented
Please add this feature. A good portion of companies are still stuck in SFTP land due to software and skillset limitations (i.e. they are not scripting or command-line savvy).
Furthermore, with SFTP support, we will be able to leverage the new Azure Data Factory v2 Event-based Triggers without "yet another complex hacky azure integration + vm + logic app + whatever".
We need to collect data from our building automation systems. We have approximately 500 systems that need to deliver data to central storage, and for many of them SFTP is the only supported protocol. SFTP access to storage in Azure would be great.
SFTP is critical for integration with data providers that don't yet support cloud-based infrastructure or are unwilling to take 'special' steps to send their data to a third party. Please implement.
Nicholas Piasecki commented
Both! In logistics, a terrifying amount of data is flat files flung around over SFTP and/or FTPS. FTPS is advantageous because you can communicate with it directly from .NET, but the .NET Framework lacks built-in support for SFTP.
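For illustration, here is a minimal FTPS upload sketch using Python's standard library (host, credentials, and file names are made-up placeholders; Python is used only for illustration – the same explicit-TLS pattern applies when using .NET's FtpWebRequest):

```python
from ftplib import FTP_TLS

def upload_via_ftps(host, user, password, local_path, remote_name):
    """Upload a single flat file over explicit FTPS (FTP over TLS)."""
    ftps = FTP_TLS(host)          # connects and expects AUTH TLS on the control channel
    ftps.login(user, password)
    ftps.prot_p()                 # switch the data channel to TLS as well
    with open(local_path, "rb") as f:
        ftps.storbinary(f"STOR {remote_name}", f)
    ftps.quit()
```

The `prot_p()` call matters: without it the control channel is encrypted but file contents travel in the clear.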
I think that SFTP access to Azure Storage (e.g. Files) is a real necessity.
SFTP would allow Azure Files to seamlessly integrate with existing applications and processes that utilise this protocol to send/receive files.
David Walmsley commented
This would be a useful addition.
Kirk Donahoe commented
Yes, please implement this feature! Preference is towards SFTP.
This is really bad; Amazon is better than this service.
Adding FTP access to blob storage would allow Flow triggers for Azure File Storage.
SFTP access to Azure Storage (Blob, File, etc.) is an absolute must, and I find it ridiculous that Azure doesn't already have it. How else can an automated system send files to storage when the VMs are set up to autoscale with dynamically assigned IPs?
Mark Jeweler commented
We have over 300 scanners geographically distributed, and we need them to be able to deliver their scanned documents to Azure in order for us to process those scans. We are limited by the capabilities built in to the scanners, and the common capability of all of our scanners is SFTP. We need to be able to SFTP the files from the scanners to Azure File Storage (or even better, Azure Blob Storage); and then to be able to have a trigger (Logic App, Event Hub, etc.) respond to the incoming scanned file appearing on the SFTP storage area.
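On the trigger side of this scenario, a handler might pull the container and blob path out of an incoming Event Grid BlobCreated notification. A rough sketch (assuming Event Grid's Blob storage event shape; the sample payload values are invented):

```python
def parse_blob_created(event: dict):
    """Extract container and blob path from an Event Grid BlobCreated event."""
    if event.get("eventType") != "Microsoft.Storage.BlobCreated":
        return None
    # Subject format: /blobServices/default/containers/{container}/blobs/{path}
    subject = event["subject"]
    _, _, rest = subject.partition("/containers/")
    container, _, blob_path = rest.partition("/blobs/")
    return {"container": container, "blob": blob_path, "url": event["data"]["url"]}

# Made-up sample payload for illustration:
sample = {
    "eventType": "Microsoft.Storage.BlobCreated",
    "subject": "/blobServices/default/containers/scans/blobs/site-042/doc.pdf",
    "data": {"url": "https://myaccount.blob.core.windows.net/scans/site-042/doc.pdf"},
}
print(parse_blob_created(sample)["blob"])  # site-042/doc.pdf
```

A Logic App or Azure Function subscribed to such events could then route each scanned document without any polling.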
Frank Migacz commented
I have a customer who wants to redesign in PaaS an HAProxy load balancer VM that a) terminates SSL and b) routes traffic to two SFTP servers on port 22. I've been trying to tackle this at the networking level, but an SSH file transfer interface in Files (or a new, similar storage service) would be even better! This customer receives files from B2B partners for processing.
Michael Heiart commented
Receiving files via SFTP is still highly relevant for data provisioning in logistics.
Please add a VM-less/serverless solution.
Would very much like SFTP access to Azure Files and Blobs – please attend to my votes...
Firstly, I just want to point out that there is a duplicate of this suggestion under "additional services"!
To add my thoughts around this:
We have a multi-tenant web-based system for financial companies, with one database per client. Part of our requirement is that we receive files from our clients and other 3rd parties – usually, but not exclusively, containing financial data – which we then load automatically into the client's individual database.
At the moment we have an SFTP product running on a VM which, for historical reasons, is not hosted within Azure (it could be). We have one login per client/per 3rd party: one 3rd-party data supplier may supply data for more than one client, and each client may have one or more additional logins of their own. Each of these logins puts files into a separate physical folder to avoid intermingling client data. The server is also IP-restricted.
We have multiple processes in Azure that poll the SFTP server for a particular login/folder (for security we may have separate read and write logins for the same folder). When a file is found it is "downloaded" to local or blob storage and then processed. Some of these processes are "per client" because they have specific requirements; some are generic and run across multiple clients (i.e. they handle some common file received for several clients). Clearly, separation is important for the processes that handle multiple clients/tenants.
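Roughly, each polling pass looks like this (a sketch only: the client object is a stand-in exposing paramiko-style listdir/rename methods, and the `download`/`processed` callbacks are hypothetical hooks for copying into blob storage and kicking off per-client processing):

```python
import posixpath

def poll_folder(sftp, folder, download, processed):
    """One polling pass over a login's folder: download each file, then move it
    into a 'processed' subfolder so the next pass does not pick it up again."""
    handled = []
    for name in sftp.listdir(folder):
        remote = posixpath.join(folder, name)
        download(remote)  # e.g. copy the remote file into blob storage
        sftp.rename(remote, posixpath.join(folder, "processed", name))
        processed(name)   # e.g. enqueue a message for the per-client loader
        handled.append(name)
    return handled
```

Moving the file aside only after a successful download keeps the pass safe to re-run if anything fails mid-way.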
So some of the things I would personally like to see are:
- SFTP as a service (i.e. no requirement for installing on a VM).
- IP restrictions (for keeping track of why we add an allowed IP/IP range it is important that each restriction can at least have a text description).
- Our current server can't do this, but I would like to see IP restrictions associated with a login or group of logins. The server would reject an incoming connection if the IP address isn't listed for any login; if the connection passes that check, there would be a second check to make sure the login comes from an IP associated with it.
- Multiple logins
- Files go to blob storage
- Each login can write to a different blob container.
- Possibly logins could be grouped which would allow:
* Setting up of the same IP restrictions across a group of logins.
* Associating a single blob container with a group of logins if desired.
- When a file is transferred it would be great to be able to trigger an action in Azure. Possibly via a queue or some sort of notification (which would save polling). Ideally triggers could be configured at a general, login group or individual login level. The queue entry/event would tell us the date/time, login name, group name(s) (if applicable) and the file name and blob location.
- Obviously security is important so perhaps encrypting the files on blob storage and having a mechanism to decrypt using Key Vault or something similar.
- From the sending side it would be good if we could add a file to the SFTP server for an external 3rd party to pick up just by adding it to a blob container (i.e. bypassing the need to use the SFTP protocol "internally" if we want to).
- Personally I am only interested in SFTP!
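The two-stage IP restriction described in the list above could be sketched like this (login names and CIDR ranges are made-up examples; a real service would presumably store these per tenant):

```python
import ipaddress

def connection_allowed(client_ip, login, allowlists):
    """Two-stage check: (1) reject the connection outright if the IP is not
    allowed for *any* login; (2) on authentication, require the IP to be
    allowed for *this* login. `allowlists` maps login -> list of CIDR strings."""
    ip = ipaddress.ip_address(client_ip)
    known = any(
        ip in ipaddress.ip_network(cidr)
        for cidrs in allowlists.values() for cidr in cidrs
    )
    if not known:
        return False  # stage 1: drop before authentication is even attempted
    # stage 2: the authenticated login must itself list this IP
    return any(ip in ipaddress.ip_network(cidr) for cidr in allowlists.get(login, []))

rules = {"clientA": ["203.0.113.0/24"], "clientB": ["198.51.100.7/32"]}
print(connection_allowed("203.0.113.5", "clientA", rules))  # True
print(connection_allowed("203.0.113.5", "clientB", rules))  # False: known IP, wrong login
```

Stage 1 lets the server drop unknown sources cheaply, while stage 2 prevents one tenant's allowed IP from being used with another tenant's login.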
James Hancock commented
We need hosted SFTP functionality for legacy partners. Adding this to blob storage would solve the problem nicely without needing a virtual machine.
We need to send performance data from an IBM mainframe to Azure blob storage or Azure data lake store. Need the mainframe to generate the files and push them into Azure.
I am actually surprised that this level of basic functionality isn't present.
It would be great if you supported the SFTP protocol for Azure Files. It's a widely used protocol, so it's easy to explain to our customers (companies) how they can send and receive files. It's also easy to integrate with ETL processes.