Amazon S3 sink
We really need the ability to write to S3, not just read from it.
Many larger clients (big groups with multiple IT departments) often run both Azure and Amazon, and ADF is getting disqualified from benchmarks against Talend Cloud and Matillion because it won't push data to other cloud services...
Please help ^^
David GROSPELIER commented
The ability to push data to S3 from other systems is a requirement. Please consider it because we have several projects where ADF was not considered because of this.
Ronieri Marques Ramalho commented
This was first posted in Dec 2018; it's now 2021 and we still don't have an S3 connector for writing? It's such a basic feature, so sad.
Henri Aalto commented
Attention, Microsoft Engineering. Data interchange between all major public clouds (Amazon, Azure and Google) is a requirement for every enterprise customer. Business Divisions or external partners may have data in either cloud and there is a real need to transfer data as easily as possible between them.
Snowflake is already demonstrating cross-cloud data sharing and I suppose Microsoft is willing to stay in the game. Or should we just inform our enterprise customers who have such a need that they should opt to use Snowflake instead?
This is a strategic decision to position yourselves to be either on the field, playing, or on the bench, watching others play. I'm more than happy to discuss the importance of this so feel free to reach out. Or better yet, just get this done.
P.S.: to anyone else needing this feature: just create a Databricks (or Spark) job to which you pass the required parameters from the Data Factory and have it move the data back to S3. Remember to add a secret scope so the relevant credentials live in Key Vault and not in the scripts. See: blog.seandaylor.com/mounting-aws-s3-and-azure-data-lake-on-databricks
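A minimal sketch of that workaround, for anyone wanting to try it. All names here (the widget names, secret scope, keys, and paths) are illustrative assumptions, not something from ADF or Databricks docs; the Databricks-runtime calls (`dbutils`, `spark`) are shown as comments since they only exist inside a notebook:

```python
# Sketch of a Databricks notebook invoked from an ADF Notebook activity.
# Only the URI helper below is plain Python; everything Databricks-specific
# is commented out because dbutils/spark exist only in the notebook runtime.

def s3a_uri(bucket: str, prefix: str) -> str:
    """Build the s3a:// URI that Spark's Hadoop S3A connector writes to."""
    return f"s3a://{bucket}/{prefix.strip('/')}"

# Inside the notebook (illustrative names, assumed parameters):
#
#   src    = dbutils.widgets.get("source_path")   # passed by the ADF activity
#   bucket = dbutils.widgets.get("s3_bucket")
#
#   # Credentials come from a Key Vault-backed secret scope, never the script:
#   spark.conf.set("fs.s3a.access.key",
#                  dbutils.secrets.get(scope="kv-scope", key="aws-access-key"))
#   spark.conf.set("fs.s3a.secret.key",
#                  dbutils.secrets.get(scope="kv-scope", key="aws-secret-key"))
#
#   spark.read.parquet(src).write.mode("overwrite").parquet(
#       s3a_uri(bucket, "export"))

print(s3a_uri("my-bucket", "/export/"))
```

The ADF pipeline then just calls this notebook with the source path and bucket as base parameters, so the "write to S3" step lives entirely in the Spark job.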
Richard Goerwitz commented
It's hard to imagine a modern cloud data movement service that would be unable to write data to an s3 bucket, yet I don't see any way to get ADF (or a logic app) to do that out of the box. This is surprising.
It's a shame that ADF can only read from S3! Is this on the roadmap?
Use a notebook activity to write to S3.
Imre Pühvel commented
This would be useful in backup/archive management as a third-copy location when no on-premises options are available.
Jeremiah Barndt commented
Currently, S3 can only be used as a source.