Troubleshooting

Please do get in touch with any questions at all - we really want to help you configure ContainDS Dashboards and get up and running.

Debug Logs

Please consult your JupyterHub distribution to understand how to access logs. For example, here are instructions for The Littlest JupyterHub.

For problems within the Dashboards menu in JupyterHub, or problems at JupyterHub startup, please see the JupyterHub logs.
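If the hub logs are not detailed enough, you can also raise JupyterHub’s own log level in jupyterhub_config.py (this is standard JupyterHub configuration, independent of ContainDS Dashboards):

c.JupyterHub.log_level = 'DEBUG'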

For problems at the stage where Dashboards seem to build correctly, but don’t work once accessed, you need to find out where the ‘notebook server’ logs are located. Each Dashboard runs as a named server, just like the default ‘My Server’ in which each user runs their Jupyter notebooks.

If your dashboard is called ‘mydash’, by default the corresponding notebook server will be called ‘dash-mydash’. Check logs for that named server.

To force the notebook server to display extra debug information, add the following to your jupyterhub_config.py:

c.Spawner.debug = True

Delete the existing named server from the user’s Home page in JupyterHub, then try building and accessing the Dashboard again.

TLJH named servers show 404 not found

Older installations of The Littlest JupyterHub have a problem getting named servers to work: Dashboards ultimately lead to a 404 not found error, and so do any named servers that you create directly from the table on the regular JupyterHub Home page. The fix is to upgrade one particular Python package, jupyterhub-traefik-proxy - versions older than 0.1.6 cause the 404 errors.

sudo /opt/tljh/hub/bin/python3 -m pip install jupyterhub-traefik-proxy==0.1.6

sudo tljh-config reload

Conda Kernels or Packages not found

If your usual Conda environment doesn’t seem to be available for your dashboard, you may need to allow your users to select which Conda environment is required when they create the dashboard. See Conda env configuration details.
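As a rough illustration, assuming you are using the conda_envs option described in the Conda env configuration details, the list of selectable environments might be set in jupyterhub_config.py along these lines (check that page for the exact option name and supported formats for your version):

c.CDSDashboardsConfig.conda_envs = ['myenv', 'base']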

For Voilà, there is an extra consideration: a specific Conda kernel may also be specified within the ipynb file metadata.

If the Voilà debug logs show something like Could not find a kernel named 'conda-env-myenv-py', will use 'python3', this means Voilà cannot find one of the Conda environments that you have made available to your notebooks. It may be that the notebook came from a different server, so the kernel does not exist locally. Check that the kernel does exist and that it works correctly when the notebook is run normally in Jupyter. Often, a kernel is registered in Jupyter to correspond to each available Conda env.
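To see which kernels are actually registered, list them from the user environment:

jupyter kernelspec list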

If it seems that only Voilà cannot find the kernel, it may be because you are using nb_conda_kernels. This is a Jupyter extension that auto-discovers all Conda envs and makes them available as kernels. However, because they are auto-discovered within Jupyter each time, they are not properly registered for external services such as Voilà to locate.

Auto-discovery of Conda envs is convenient, but nb_conda_kernels has not been maintained recently and some work is required to bring it in line with the latest Jupyter releases. Our suggestion is to remove nb_conda_kernels and register each Conda env manually - using the same names so that existing notebooks can still find the correct environments.

If you inspect an existing raw ipynb file (it is just a text file), you will find a ‘metadata’ section towards the end, where you can see the exact values generated by nb_conda_kernels for ‘name’ and ‘display_name’:

"metadata": {
    "kernelspec": {
    "display_name": "Python [conda env:myenv]",
    "language": "python",
    "name": "conda-env-myenv-py"
}...

In the activated myenv conda env, you can register the kernel using these exact names taken from above:

python -m ipykernel install --name "conda-env-myenv-py" --display-name "Python [conda env:myenv]"

Remove nb_conda_kernels from the appropriate conda env:

conda uninstall --force nb_conda_kernels

If you don’t remove nb_conda_kernels then each conda env will appear twice.

If you must keep auto-discovery of new environments, it is recommended to also manually register each env (via ipykernel install) that may be used for Voilà Dashboards, perhaps with slightly different names from their auto-discovered equivalents.

Problems after upgrading

When upgrading from an older version you must upgrade both the hub and user environments (the user env may not be applicable to DockerSpawner installations, but a new image may be required).
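As a rough sketch, a pip-based upgrade might look like the commands below; the exact package extras and environment paths depend on your installation, so check the Upgrading guide for the commands that apply to your distribution:

# Hub environment
python3 -m pip install --upgrade cdsdashboards

# User environment (not applicable to DockerSpawner - rebuild the image instead)
python3 -m pip install --upgrade "cdsdashboards[user]"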

If a database upgrade is required, users will be prompted within the JupyterHub website - see the Changelog.

See the guide on Upgrading.

Dashboards (and servers) keep dying

If you find that dashboards (and also servers) seem to disappear after a few minutes of inactivity, it may be that you are running a process to ‘cull idle servers’.

The default behavior for ContainDS Dashboard servers is to always report activity back to the hub (even if there hasn’t been any), which should normally keep them alive. Adjust the Server Timeouts and Keep Alive settings if you have changed the defaults, or increase the timeout value in your server idle-culling service.
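If you are running the standard jupyterhub-idle-culler as a JupyterHub service, the cull threshold is its --timeout argument. The snippet below is only a sketch of such a service definition - adapt it to however your culling service is actually configured:

import sys

c.JupyterHub.services = [
    {
        'name': 'idle-culler',
        # Cull servers only after 2 hours (7200 seconds) of inactivity
        'command': [sys.executable, '-m', 'jupyterhub_idle_culler', '--timeout=7200'],
    }
]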

For example in The Littlest JupyterHub, this default service can be disabled as described here.

Dashboard just shows Jupyter server as normal

If the dashboard appears as a regular Jupyter server instead of the presentation (e.g. Voilà, Streamlit etc), it may be that you are still using the standard spawners. You need to use the ‘Variable’ spawners supplied with ContainDS Dashboards.

See Setup for details of which c.JupyterHub.spawner_class to set.

Note that from version 0.1.0 onwards, DockerSpawner needs to be replaced with VariableDockerSpawner. (Previous versions worked with the standard DockerSpawner.)
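For example, for a DockerSpawner-based installation the setting would be along the following lines - treat the import path below as illustrative and use the exact one given in Setup:

c.JupyterHub.spawner_class = 'cdsdashboards.hubextension.spawners.variabledocker.VariableDockerSpawner'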

I can’t choose profiles or other spawner options for dashboards

If your spawner is configured so that normal Jupyter servers allow the user to select options first (such as a Docker image or another profile), and you also want the dashboard creator to be able to select these options, try Spawner User Options Form.
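That page describes enabling the options form for dashboards; assuming your version supports the flag, the configuration is along these lines (check the docs for the exact option name):

c.CDSDashboardsConfig.spawn_default_options = False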

Dashboards work but “object NoneType” error in logs

You may see this in the singleuser dashboard server logs:

ERROR:tornado.application:Uncaught exception GET /user/danlester/dash-example/ (127.0.0.1)
HTTPServerRequest(protocol='http', host='127.0.0.1:42712', method='GET', uri='/user/danlester/dash-example/', version='HTTP/1.1', remote_ip='127.0.0.1')
Traceback (most recent call last):
File "/opt/conda/lib/python3.7/site-packages/tornado/web.py", line 1703, in _execute
    result = await result
File "/opt/conda/lib/python3.7/site-packages/jhsingle_native_proxy/websocket.py", line 94, in get
    return await self.http_get(*args, **kwargs)
File "/opt/conda/lib/python3.7/site-packages/jhsingle_native_proxy/proxyhandlers.py", line 592, in http_get
    return await self.proxy(self.port, path)
File "/opt/conda/lib/python3.7/site-packages/jhsingle_native_proxy/proxyhandlers.py", line 586, in proxy
    return await self.oauth_proxy(port, path)
TypeError: object NoneType can't be used in 'await' expression

This is actually normal behavior, and is due to a workaround in some core JupyterHub code.

It can be safely ignored, and hasn’t in itself been known to cause any problems with dashboards.

Streamlit Components aren’t working

In newer browsers, Streamlit Components may not work correctly. This is because components make heavy use of iframes, which do not always send cookies, so ContainDS Dashboards isn’t able to identify the user correctly. You’ll see something like this in the JavaScript console:

A cookie associated with a cross-site resource was set without the ‘SameSite’ attribute. It has been blocked, as Chrome now only delivers cookies with cross-site requests if they are set with ‘SameSite=None’ and ‘Secure’.

To fix this, your site needs to be served over HTTPS and your user environment needs to be running Python 3.8 or later.

In your jupyterhub_config.py, add:

c.VariableMixin.extra_presentation_launchers = {
    'streamlit': {
        'env': {'JUPYTERHUB_COOKIE_OPTIONS': '{"SameSite": "None", "Secure": true}'}
    }
}

Alternatively, you could set that environment variable at a higher level to affect all singleuser servers, for example through c.Spawner.environment or c.Spawner.env_keep. See the JupyterHub Spawner docs.
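For example, using c.Spawner.environment (standard JupyterHub configuration) to pass the same cookie options to every singleuser server:

c.Spawner.environment = {
    'JUPYTERHUB_COOKIE_OPTIONS': '{"SameSite": "None", "Secure": true}'
}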