Read a CSV file and bulk load it into an Azure SQL table
I would like to read in a file from blob storage and add rows to a table in Azure SQL. It would be nice to have a bulk import like bcp, but also to be able to append new rows as the file grows.
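A minimal sketch of the "bulk load plus incremental append" idea, using only the standard library: a byte-offset is kept per file so each run parses only the rows added since the last run, and `executemany` does the bulk insert. The table/column names are made up, and sqlite3 stands in for Azure SQL here so the snippet is self-contained; against Azure SQL you would swap the connection for a pyodbc one and parameterize accordingly.

```python
import csv
import io
import sqlite3

def load_new_rows(conn, csv_text, offset):
    """Parse only the rows past `offset` (characters already processed)
    and bulk-insert them with executemany. Returns the new offset.
    Assumes a headerless CSV of (name, qty) rows."""
    new_part = csv_text[offset:]
    rows = list(csv.reader(io.StringIO(new_part)))
    if rows:
        conn.executemany("INSERT INTO items (name, qty) VALUES (?, ?)", rows)
        conn.commit()
    return len(csv_text)

# sqlite3 stands in for the Azure SQL connection in this sketch
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, qty INTEGER)")

data = "apples,3\npears,5\n"
offset = load_new_rows(conn, data, 0)       # initial bulk load
data += "plums,7\n"                         # the file grew
offset = load_new_rows(conn, data, offset)  # loads only the new row

print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # → 3
```

In a real pipeline the offset would be persisted (e.g. in a control table) between runs, and the blob would be read with the Azure Storage SDK rather than held in a string.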
This appears to be more of an ETL task – we would suggest you look at Azure Data Factory to assist with this (which can be triggered from Logic Apps).
Add an action to convert CSV to JSON or another structured format
The way I do this is move the file to Azure Data Lake, then have a job there parse the data. Then I go to my ADW (Azure SQL Data Warehouse) and pull the data in via PolyBase and external tables. A lot of work, but ADL makes it very performant.
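For anyone unfamiliar with the PolyBase step in that workflow, the statements it relies on look roughly like the following. This is only a sketch composed as a string; every name (data source, file format, table, paths) is a hypothetical placeholder, and the credential setup for the Data Lake account is omitted.

```python
# Hypothetical PolyBase DDL for the "external table -> ADW table" step.
# All identifiers and locations are placeholders, not real resources.
polybase_ddl = """
CREATE EXTERNAL DATA SOURCE AdlSource
WITH (TYPE = HADOOP,
      LOCATION = 'adl://myaccount.azuredatalakestore.net');
      -- CREDENTIAL = ... (ADLS credential setup omitted in this sketch)

CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

CREATE EXTERNAL TABLE dbo.ItemsExternal (name NVARCHAR(100), qty INT)
WITH (LOCATION = '/parsed/items/',
      DATA_SOURCE = AdlSource,
      FILE_FORMAT = CsvFormat);

-- Pull the external data into a regular ADW table
CREATE TABLE dbo.Items WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.ItemsExternal;
"""
print(polybase_ddl)
```

The external table is just a schema over the files in the Data Lake folder; the final CTAS statement is what actually materializes the data inside the warehouse.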
Juliano D commented
Hey Jason - have you managed to find a way to do this?