Add ability to import Databricks Notebook
As a Data Engineer, I want to import my existing Databricks notebooks directly from a Databricks workspace and have them converted to run on Azure Synapse.
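Pulling notebooks out of a workspace could go through the Databricks Workspace REST API (`GET /api/2.0/workspace/export`). A minimal sketch of building that request follows; the host, path, and token values in the usage example are hypothetical, and a real importer would still need to issue the HTTP call and decode the response.

```python
def export_request(host: str, notebook_path: str, token: str):
    """Build the URL, headers, and query params for the Databricks
    Workspace export API (GET /api/2.0/workspace/export).

    `format` can be SOURCE, HTML, JUPYTER, or DBC; SOURCE gives the
    raw notebook source, which is what a converter would want."""
    url = f"https://{host}/api/2.0/workspace/export"
    headers = {"Authorization": f"Bearer {token}"}  # PAT auth
    params = {"path": notebook_path, "format": "SOURCE"}
    return url, headers, params


# Hypothetical workspace host and notebook path, for illustration only:
url, headers, params = export_request(
    "adb-1234567890.12.azuredatabricks.net", "/Users/me/etl_notebook", "dapi-xxxx"
)
```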
When a notebook is imported, references to incompatible features should either be
- commented out, flagged as incompatible, and annotated with links to documentation for the Synapse equivalent
OR, BETTER YET,
- handled by a wizard that walks the user through setting up the workaround
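The "comment out and annotate" option could be sketched roughly as below. The `INCOMPATIBLE` table is illustrative only: which APIs get flagged, and the doc links attached to them, are placeholders, not a real mapping.

```python
# Hypothetical map of incompatible Databricks APIs to workaround docs.
# Both the keys and the link targets here are placeholders.
INCOMPATIBLE = {
    "dbutils.fs.mount": "synapse-docs/mount-equivalent",
    "dbutils.secrets": "synapse-docs/secrets-workaround",
}

def annotate_notebook(source: str) -> str:
    """Comment out lines that call features with no direct Synapse
    equivalent, tagging each with a link to the suggested workaround."""
    out = []
    for line in source.splitlines():
        hit = next((api for api in INCOMPATIBLE if api in line), None)
        if hit:
            out.append(f"# INCOMPATIBLE ({hit}): see {INCOMPATIBLE[hit]}")
            out.append(f"# {line}")
        else:
            out.append(line)
    return "\n".join(out)
```

A wizard-based importer would use the same detection pass, but prompt for the workaround instead of commenting the line out.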
Notebooks reference existing schemas and tables, so enable automated import of schemas and external table definitions from a running Databricks Hive metastore.
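One possible way to harvest those definitions is to enumerate the metastore and run standard Spark SQL `SHOW CREATE TABLE` statements, then replay the captured DDL on Synapse. The helper below only builds the statements; an actual exporter would execute them via `spark.sql(...)` against the Databricks cluster.

```python
def ddl_export_statements(catalog: dict[str, list[str]]) -> list[str]:
    """Given a mapping of database -> table names (as returned by
    SHOW DATABASES / SHOW TABLES), build the SHOW CREATE TABLE
    statements needed to extract each table's DDL from the metastore."""
    stmts = []
    for db, tables in catalog.items():
        for table in tables:
            stmts.append(f"SHOW CREATE TABLE `{db}`.`{table}`")
    return stmts


# Hypothetical metastore contents, for illustration:
stmts = ddl_export_statements({"sales": ["orders", "customers"]})
```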
External tables reference storage, so also enable registering the referenced storage accounts in the Data hub.
Notebooks reference Databricks mount points, so allow users to create references to Azure Storage and implement a seamless pass-through so notebooks can access exactly the same paths.
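The pass-through could be as simple as rewriting `/mnt/<name>/...` paths to the ABFSS URIs of the underlying Azure Storage locations. A sketch, assuming the importer has already collected a mount-name-to-URI mapping (the sample mapping in the usage line is made up):

```python
def translate_mount_path(path: str, mounts: dict[str, str]) -> str:
    """Rewrite a Databricks mount path (/mnt/<name>/...) to the
    abfss:// URI of the Azure Storage location it points at.

    `mounts` maps mount names to abfss:// base URIs. Paths that do
    not match any known mount are returned unchanged."""
    for name, base in mounts.items():
        prefix = f"/mnt/{name}"
        if path == prefix or path.startswith(prefix + "/"):
            return base.rstrip("/") + path[len(prefix):]
    return path


# Hypothetical mount table, for illustration:
mounts = {"raw": "abfss://raw@mydatalake.dfs.core.windows.net"}
uri = translate_mount_path("/mnt/raw/2021/01.csv", mounts)
```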
Just some ideas, thanks CR