

Neuroglancer & CloudVolume#

This tutorial will show you how to pull data from Neuroglancer using CloudVolume.

Neuroglancer is a WebGL-based viewer for volumetric data. You may have used it to browse some of the recent large EM datasets. If you want to programmatically access/download these data, you need CloudVolume. CloudVolume is an excellent Python library developed by William Silversmith (Seung lab, Princeton) and others. While CloudVolume is not directly related to Neuroglancer, it shares much of its functionality. As a rule of thumb: if you can view a dataset in Neuroglancer, you can download that data using CloudVolume. For example:

  1. FlyWire is a segmentation of an entire Drosophila brain. This dataset is still very much a work in progress and you will need to register and apply for access. Check out FAFBseg for a fairly mature interface built on top of NAVis.
  2. Google's flood-filling segmentation of an entire Drosophila brain.
  3. The Allen Institute's MICrONs datasets. We have a separate tutorial on this!
  4. The Janelia hemibrain connectome.

You can find the source for the data you want to access by right-clicking on the layer in question and selecting the "Source" tab on the right:

[Screenshot: the "Source" tab of a layer in Neuroglancer]

CloudVolume supports pretty much all the backends/data formats that Neuroglancer does. You can use it to programmatically query the segmentation itself, and to fetch meshes and skeletons (if available). NAVis & friends provide simple interfaces for some of the datasets (see e.g. the neuPrint and the MICrONs tutorials) but there is also a lower-level option to pull neurons into NAVis via CloudVolume.

First of all, you will want to make sure cloud-volume is installed and up-to-date:

pip install cloud-volume -U

Once that's done, we can start pulling data using cloud-volume. In this example, we will use the Google segmentation of the FAFB dataset:

import navis
import cloudvolume as cv

Before we connect to the data source we have to "monkey patch" cloudvolume using navis.patch_cloudvolume. That will teach cloudvolume to return NAVis neurons:

# This needs to be run only once at the beginning of each session
navis.patch_cloudvolume()

Now we can connect to our data source. Here we connect to the Google segmentation of the FAFB dataset:

# Don't forget to set `use_https=True` to avoid having to set up Google credentials!
vol = cv.CloudVolume(
    "precomputed://gs://fafb-ffn1-20200412/segmentation", use_https=True, progress=False
)
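
Aside: the same vol object can also be used to query the segmentation itself, i.e. to download a cutout of segment IDs as a numpy array. Below is a minimal sketch to illustrate the slicing syntax; the coordinates are placeholders (in voxel space at the volume's current mip) and need to be adjusted to a region you actually care about:

# Hypothetical cutout - adjust the coordinates to your region of interest!
# Slicing returns a (x, y, z, channel) array of segment IDs.
cutout = vol[115000:115128, 45000:45128, 3000:3008]
print(cutout.shape, cutout.dtype)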

Fetch neuron meshes:

# Setting `as_navis=True` will get us MeshNeurons
m = vol.mesh.get([4335355146, 2913913713, 2137190164, 2268989790], as_navis=True, lod=3)
m
<class 'navis.core.neuronlist.NeuronList'> containing 4 neurons (1.8MiB)
   type              name  id          units        n_vertices  n_faces
0  navis.MeshNeuron  None  2268989790  1 nanometer  10980       19189
1  navis.MeshNeuron  None  2137190164  1 nanometer  10085       17612
2  navis.MeshNeuron  None  4335355146  1 nanometer  17066       30420
3  navis.MeshNeuron  None  2913913713  1 nanometer  18224       31213

Shortcut

Instead of vol.mesh.get(..., as_navis=True) you can also use the shortcut vol.mesh.get_navis(...) which is equivalent.
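
For example, the call above could (assuming the remaining arguments are simply passed through) also be written as:

# Same IDs and level of detail as above, just using the shortcut
m = vol.mesh.get_navis([4335355146, 2913913713, 2137190164, 2268989790], lod=3)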

Plot!

navis.plot3d(
    m,
    legend_orientation="h",  # few neurons, so we can afford a horizontal legend
)
# And one 2D plot (for the tutorial thumbnail)
import matplotlib.pyplot as plt

fig, ax = navis.plot2d(m[1], method="2d", view=("x", "-y"))
ax.set_axis_off()
ax.grid(False)
plt.tight_layout()

[Figure: 2D plot of one of the downloaded neurons]

This also works for skeletons:

sk = vol.skeleton.get([4335355146, 2913913713, 2137190164, 2268989790], as_navis=True)
sk
<class 'navis.core.neuronlist.NeuronList'> containing 4 neurons (2.1MiB)
   type              name  id          n_nodes  n_connectors  n_branches  n_leafs  cable_length  soma  units        created_at                  origin
0  navis.TreeNeuron  SWC   4335355146  27460    None          2184        2192     8219293.5     None  1 nanometer  2025-08-08 15:00:58.713775  string
1  navis.TreeNeuron  SWC   2913913713  28640    None          2369        2381     8557471.0     None  1 nanometer  2025-08-08 15:00:59.002361  string
2  navis.TreeNeuron  SWC   2137190164  15404    None          867         870      4712362.0     None  1 nanometer  2025-08-08 15:00:59.154182  string
3  navis.TreeNeuron  SWC   2268989790  18105    None          945         946      5545407.5     None  1 nanometer  2025-08-08 15:00:59.330926  string

Note that not all datasets contain precomputed skeletons! In that case you could download the meshes and use navis.skeletonize to skeletonize them.
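
As a rough sketch (the default settings of navis.skeletonize are a reasonable starting point but may need tuning for your meshes):

# Fallback: skeletonize the MeshNeurons we fetched above
sk_from_mesh = navis.skeletonize(m)
sk_from_mesh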

Try it out!

If you are working a lot with Neuroglancer and need to e.g. generate or parse URLs, you might want to check out the nglscenes package.

Total running time of the script: (0 minutes 4.511 seconds)

Download Python source code: tutorial_remote_01_cloudvolume.py

Download Jupyter notebook: tutorial_remote_01_cloudvolume.ipynb

Gallery generated by mkdocs-gallery