For data-heavy notebooks with long-running cells and slow top-to-bottom runtimes, it's really useful in the Databricks notebook web UI to be able to close the browser tab, the browser app, or the computer and come back later to all of the existing kernel state just as you left it.
It'd be really useful to enable this in Databricks Power Tools notebooks too. Something like the following (see the sketch after this list):
- Store the active (clusterId, contextId) per notebook editor (from /contexts/create)
- On VS Code restart/reload, notebook editor reopen, etc., load any stored (clusterId, contextId) and reuse it if that context and cluster are still alive (from /contexts/status)
- Rely on the Databricks cluster to automatically terminate idle execution contexts (enabled by default):
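
Here's a minimal sketch of what that flow could look like, assuming the Command Execution API 1.2 endpoints (`/api/1.2/contexts/create` and `/api/1.2/contexts/status`) and Node 18+'s global `fetch`. The names `NotebookContext`, `loadStoredContext`, `saveStoredContext`, and `getOrCreateContext` are hypothetical placeholders for whatever the extension already uses internally (e.g. a `vscode.Memento` keyed by notebook URI):

```typescript
interface NotebookContext {
  clusterId: string;
  contextId: string;
}

// Hypothetical persistence helpers keyed by notebook URI
// (e.g. backed by context.workspaceState in the extension).
declare function loadStoredContext(notebookUri: string): NotebookContext | undefined;
declare function saveStoredContext(notebookUri: string, ctx: NotebookContext): void;

// Tiny REST helper: POST with a JSON body, otherwise GET.
async function api(host: string, token: string, path: string, body?: unknown): Promise<any> {
  const res = await fetch(`${host}${path}`, {
    method: body ? "POST" : "GET",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: body ? JSON.stringify(body) : undefined,
  });
  if (!res.ok) throw new Error(`${path} failed with HTTP ${res.status}`);
  return res.json();
}

// On notebook (re)open: reuse the stored execution context if it is still
// alive on the same cluster, otherwise create a fresh one and persist it.
async function getOrCreateContext(
  host: string,
  token: string,
  notebookUri: string,
  clusterId: string,
  language: "python" | "scala" | "sql" = "python"
): Promise<NotebookContext> {
  const stored = loadStoredContext(notebookUri);
  if (stored && stored.clusterId === clusterId) {
    try {
      // /contexts/status returns e.g. { id, status: "Running" | "Pending" | "Error" }
      const status = await api(
        host,
        token,
        `/api/1.2/contexts/status?clusterId=${stored.clusterId}&contextId=${stored.contextId}`
      );
      if (status.status === "Running") {
        // Previous session's kernel state is still there; keep using it.
        return stored;
      }
    } catch {
      // Context is gone (cluster restarted, idle context auto-terminated, etc.),
      // so fall through and create a new one.
    }
  }

  // /contexts/create returns { id } for the new execution context.
  const created = await api(host, token, "/api/1.2/contexts/create", { clusterId, language });
  const fresh: NotebookContext = { clusterId, contextId: created.id };
  saveStoredContext(notebookUri, fresh);
  return fresh;
}
```

The same check could also run lazily on the first cell execution after a reload instead of on editor open, so a dead context only costs one extra round trip when the notebook is actually used.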