No exception is thrown back to the notebook when the SQL session is killed during a Spark JDBC read
We are experiencing a strange issue while fetching data from SQL Server into Databricks.
The statement below is used to fetch the data:
spark.read.jdbc(jdbcUrl, "EAP.vw_ActiveAssignments", connectionProperties)
  .write.mode(SaveMode.Overwrite)
  .saveAsTable("tms.Assignments")
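For completeness, jdbcUrl and connectionProperties are built roughly as follows; the host, database, and credentials here are placeholders, not our real values:

import java.util.Properties
import org.apache.spark.sql.SaveMode

// Placeholder connection details -- the real job reads these from secrets.
val jdbcUrl = "jdbc:sqlserver://<host>:1433;databaseName=<database>"
val connectionProperties = new Properties()
connectionProperties.setProperty("user", "<user>")
connectionProperties.setProperty("password", "<password>")
connectionProperties.setProperty("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")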
If the session gets killed on the SQL Server side, no exception is thrown back to the notebook. The notebook treats the command as successful and continues executing the next one.
Because only part of the data has been pulled at that point, the subsequent logic is impacted.
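As a stop-gap we are considering a row-count check after the write. A rough sketch follows; the check itself is our addition, and it can still race if the view changes between the load and the count:

// Compare the source row count against what actually landed in the target table.
val sourceCount = spark.read
  .jdbc(jdbcUrl, "(SELECT COUNT_BIG(*) AS cnt FROM EAP.vw_ActiveAssignments) AS src", connectionProperties)
  .first().getLong(0)
val targetCount = spark.table("tms.Assignments").count()
if (sourceCount != targetCount) {
  // Fail the notebook explicitly instead of silently continuing with partial data.
  throw new IllegalStateException(
    s"Partial data pull detected: source=$sourceCount, target=$targetCount")
}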
It seems that if we use JDBC directly to read the data, an exception is raised immediately once the session is killed on the SQL Server side. We are not sure why the same does not happen with the Spark read.
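For comparison, this is roughly what our direct JDBC test looks like (reusing the placeholder jdbcUrl and connectionProperties from above); when the session is killed mid-fetch, rs.next() throws a SQLException straight away:

import java.sql.DriverManager

// Plain JDBC read of the same view, bypassing Spark.
val conn = DriverManager.getConnection(jdbcUrl, connectionProperties)
try {
  val stmt = conn.createStatement()
  val rs = stmt.executeQuery("SELECT * FROM EAP.vw_ActiveAssignments")
  while (rs.next()) {
    // Process the current row here; a killed session surfaces as a
    // SQLException from rs.next() rather than being swallowed.
  }
} finally {
  conn.close()
}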
Could you please consider fixing this issue as a priority?