JupyterHub App

About

JupyterHub brings the power of notebooks to groups of users. It gives users access to computational environments and resources without burdening the users with installation and maintenance tasks.

How it works

The JupyterHub app runs as a container (like any other Cloudron app). The hub manages user login and creates a separate container for each user's notebooks. The notebook container is created from the c.DockerSpawner.image setting (see below for how to customize this). Each notebook container runs with a memory limit configured via c.Spawner.mem_limit. The advantage of this approach is that you control how much compute and memory is allocated to each user, and a runaway notebook cannot bring down the whole server.
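Both settings go in /app/data/customconfig.py and are covered in detail below. A minimal sketch, using the defaults described in this document:

# /app/data/customconfig.py -- a minimal sketch; values shown are the documented defaults
c.DockerSpawner.image = 'jupyter/datascience-notebook'  # image for each user's notebook container
c.Spawner.mem_limit = '500M'                            # per-container memory limit (including swap)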

If you change the notebook image or any notebook-related configuration, the notebook containers have to be recreated. To help with this, run the /app/code/remove_notebook_containers.py script. Note that this removes only the containers, not the users' notebook data.
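For example, from a Web Terminal of the app (a sketch, assuming the script is invoked with the app's Python interpreter):

python3 /app/code/remove_notebook_containers.py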

Selecting a notebook image

By default, the app uses the jupyter/datascience-notebook image. The upstream Jupyter Docker Stacks project maintains many other notebook images.

To use a different notebook image, use the File Manager to place custom configuration in /app/data/customconfig.py. For example, add a line like the one below:

c.DockerSpawner.image = 'quay.io/jupyter/all-spark-notebook:lab-4.1.5'

It is also possible to use any Docker image built from jupyter/base-notebook or any of the specialized notebooks based on it. For example:

FROM quay.io/jupyter/all-spark-notebook

RUN conda install --yes -c conda-forge git-lfs

Build the image and push it to a registry such as Docker Hub, then update c.DockerSpawner.image as shown above.
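A sketch of the build and push steps (myuser/custom-notebook is a placeholder for your own Docker Hub repository):

docker build -t myuser/custom-notebook:latest .
docker push myuser/custom-notebook:latest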

To apply the configuration, restart the app using the Restart button.

Remove existing notebook containers

For the new container image to take effect, you have to remove any existing notebook containers using the /app/code/remove_notebook_containers.py script. Notebook data remains intact when containers are deleted.

Notebook memory limit

By default, each notebook container is limited to 500M of memory (including swap). This can be changed by editing /app/data/customconfig.py:

c.Spawner.mem_limit = '1G'

To apply the configuration, restart the app using the Restart button.

Remove existing notebook containers

For the new memory limit to take effect, you have to remove any existing notebook containers using the /app/code/remove_notebook_containers.py script. Notebook data remains intact when containers are deleted.

Notebook persistence

All notebooks are part of the application backup and persisted across updates.

Libraries installed using conda or pip live in the notebook container and are not part of the backup. Idle notebooks are shut down over time, but their containers are not destroyed, so libraries installed in a notebook container will generally persist.

If the notebook container is deleted, any libraries that were previously installed have to be re-installed.
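For example, a package installed from a notebook cell lands in the container's environment, not in the backup (a sketch; rich is just an illustrative package):

# run in a notebook cell; installs into the notebook container only
!pip install rich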

Multiple user environments

By default, the app allows a user to create multiple environments (named servers). You can disable this by setting allow_named_servers to False in /app/data/customconfig.py:

c.JupyterHub.allow_named_servers = False

Sharing

To share notebooks between users with a shared directory (/shared):

  • Open a Web Terminal for the JupyterHub app, then create the /app/data/shared directory and make it accessible to all users:
mkdir /app/data/shared
chmod 0777 /app/data/shared
  • Edit /app/data/customconfig.py and add this configuration:
    c.DockerSpawner.volumes['/app/data/shared'] = '/home/jovyan/shared'
  • Remove the existing notebook containers by running /app/code/remove_notebook_containers.py in the Web Terminal.
  • Restart the app.

The above approach can be extended to use Cloudron Volumes. Note that data in the shared directory is then not backed up, because Volume storage is not part of app backups.
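For example, assuming a Volume is mounted into the app at /media/teamdata (a hypothetical mount path; use whatever mount path you configured for the Volume):

c.DockerSpawner.volumes['/media/teamdata'] = '/home/jovyan/teamdata'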

Extensions

It's possible to install and enable extensions. However, as noted in Notebook persistence, extensions installed using pip or conda are not part of the backup, so they need to be re-installed when the notebook container is recreated (for example, after changing the notebook image).
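For example, the jupyterlab-git extension can be installed from a terminal inside a running notebook (a sketch; any pip-installable extension works the same way, and the notebook may need a restart to pick it up):

pip install jupyterlab-git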

Other custom configuration

Use the File Manager to place any other custom configuration in /app/data/customconfig.py.
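For example, to also cap CPU usage per notebook container (a sketch; cpu_limit is a standard JupyterHub Spawner option enforced by DockerSpawner):

c.Spawner.cpu_limit = 2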

See the JupyterHub docs for more information.