Develop code using Python, SQL, R, and Scala. The languages can be combined in a single notebook: a cell can override the notebook’s default language with a magic command on its first line, such as %python, %sql, %scala, or %r.
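For example, in a notebook whose default language is Python, a single cell can be run as SQL by starting it with the %sql magic command (the table name below is purely illustrative):

```sql
%sql
-- This cell runs as SQL even though the notebook's default language is Python
SELECT id, name
FROM my_schema.customers
LIMIT 10
```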
Autocomplete and automatic code formatting are also available for Python and SQL.
To run other notebooks from a notebook, use %run or dbutils.notebook.run().
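A minimal sketch of both approaches, assuming a child notebook exists at /Shared/child_notebook (the path and arguments are illustrative):

```python
# %run must be the only content of its own cell. It inlines the child
# notebook, so the child's variables and functions become available here:
# %run /Shared/child_notebook

# dbutils.notebook.run() instead executes the child notebook as a separate,
# ephemeral run and returns its exit value (set via dbutils.notebook.exit)
# as a string.
result = dbutils.notebook.run(
    "/Shared/child_notebook",  # notebook path (illustrative)
    600,                       # timeout in seconds
    {"env": "dev"},            # arguments, read in the child via widgets
)
print(result)
```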
The environment can be customized with libraries in both notebooks and jobs; Java, Scala, and Python libraries can be uploaded and installed.
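For Python, a lightweight option is a notebook-scoped library installed with the %pip magic command in its own cell; the package name below is just an example. The library is only available to the current notebook session, and the Python interpreter restarts after installation:

```python
%pip install openpyxl
```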
Create and manage scheduled jobs that run the notebook automatically, directly from the notebook UI.
Results can be exported in two formats, .html or .ipynb, by choosing ‘File’ > ‘Export’ in the notebook toolbar and selecting the export format.
Use Git integration with Databricks Repos to store and version notebooks. It supports cloning a repository, committing and pushing, pulling, branch management, and visual comparison of changes.
If you have notebooks associated with a Delta Live Tables pipeline, you can access the pipeline’s details, initiate a pipeline update, or remove the pipeline.
As stated above, Databricks notebooks facilitate real-time collaboration with colleagues: you can co-edit a notebook simultaneously, add comments, and share notebooks with each other.