Make utilities available in Spark pools like those already existing in Databricks
It would be good to have Spark utilities like those already available in Databricks, for example executing one notebook from another notebook and passing it parameters:
dbutils.notebook.run('NotebookName', 3600, parameters)
This is needed to build a dynamic notebook that can trigger the execution of other notebooks.
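As a sketch of the requested pattern, this is how notebook chaining with parameters looks in Databricks today. The notebook name and parameter values here are hypothetical; `dbutils` is injected by the Databricks runtime, so the call is guarded and skipped when run elsewhere.

```python
# Hypothetical parameters passed to the child notebook as a dict of strings.
parameters = {"source_table": "sales_raw", "target_table": "sales_clean"}

if "dbutils" in globals():
    # Inside Databricks: run ChildNotebook with a 3600-second timeout.
    # The return value is the string the child passes to dbutils.notebook.exit().
    result = dbutils.notebook.run("ChildNotebook", 3600, parameters)
else:
    # Outside a Databricks notebook there is no dbutils object, so we skip the call.
    result = None
```

The equivalent in Synapse Spark pools would let one orchestrating notebook fan out work to parameterized child notebooks without an external scheduler.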
Euan Garden commented
We are working on an mssparkutils library.