
Neuroglancer & CloudVolume#

This tutorial will show you how to access data from Neuroglancer using CloudVolume.

Neuroglancer is a WebGL-based viewer for volumetric data. You may have used it to browse some of the recent large EM datasets. If you want to programmatically access/download these data, you need CloudVolume. CloudVolume is an excellent Python library developed by William Silversmith (Seung lab, Princeton) and others. While CloudVolume is not directly related to Neuroglancer, it shares much of its functionality. As a rule of thumb: if you can view a dataset in Neuroglancer, you can download that data using CloudVolume. For example:

  1. FlyWire is a segmentation of an entire Drosophila brain. This dataset is very much a work in progress and you will need to register and apply for access. Check out FAFBseg for a fairly mature interface built on top of NAVis.
  2. Google's flood-filling segmentation of an entire Drosophila brain.
  3. The Allen Institute's MICrONs datasets. We have a separate tutorial on this!
  4. The Janelia hemibrain connectome.

CloudVolume supports the backends/data formats of these and many up-and-coming datasets. You can use it to query the segmentation directly, and to fetch meshes and skeletons (if available). NAVis & friends provide convenient interfaces for some of these datasets (see e.g. the neuPrint and MICrONs tutorials), but there are also lower-level options to pull neurons into NAVis via CloudVolume.

First of all, you will want to make sure that cloud-volume is installed and up-to-date:

pip install cloud-volume -U

Once that's done, we can start pulling data using cloud-volume:

import navis
import cloudvolume as cv

Before we connect to the data source, we have to "monkey patch" cloudvolume using navis.patch_cloudvolume. This teaches cloudvolume to return NAVis neurons:

# This needs to be run only once at the beginning of each session
navis.patch_cloudvolume()

Now we can connect to our data source. Here we connect to the Google segmentation of the FAFB dataset:

# Don't forget to set `use_https=True` to avoid having to set up Google credentials
vol = cv.CloudVolume(
    "precomputed://gs://fafb-ffn1-20200412/segmentation", use_https=True, progress=False
)

Fetch some (mesh) neurons:

# Setting `as_navis=True` will get us MeshNeurons
m = vol.mesh.get([4335355146, 2913913713, 2137190164, 2268989790], as_navis=True, lod=3)
m
<class 'navis.core.neuronlist.NeuronList'> containing 4 neurons (3.4MiB)

               type  name          id        units  n_vertices  n_faces
0  navis.MeshNeuron  None  2137190164  1 nanometer        8631    17612
1  navis.MeshNeuron  None  2268989790  1 nanometer        9450    19189
2  navis.MeshNeuron  None  4335355146  1 nanometer       14784    30420
3  navis.MeshNeuron  None  2913913713  1 nanometer       15388    31213
navis.plot3d(
    m,
    legend_orientation="h"  # few neurons, so we can afford a horizontal legend
    )
# And one 2d plot for the tutorial thumbnail
import matplotlib.pyplot as plt
fig, ax = navis.plot2d(m[1], method='2d', view=("x", "-y"))
ax.set_axis_off()
ax.grid(False)
plt.tight_layout()

[Figure: 2D rendering of one of the fetched mesh neurons]

This also works for skeletons. Note though that not all datasets contain precomputed skeletons! For those cases you might want to check out navis.skeletonize to skeletonize neuron meshes.

sk = vol.skeleton.get([4335355146, 2913913713, 2137190164, 2268989790], as_navis=True)
sk
<class 'navis.core.neuronlist.NeuronList'> containing 4 neurons (2.1MiB)

               type  name          id  n_nodes  n_connectors  n_branches  n_leafs  cable_length  soma        units                  created_at  origin
0  navis.TreeNeuron   SWC  4335355146    27460          None        2184     2192     8219293.5  None  1 nanometer  2024-09-18 12:21:23.403101  string
1  navis.TreeNeuron   SWC  2913913713    28640          None        2369     2381     8557471.0  None  1 nanometer  2024-09-18 12:21:23.884636  string
2  navis.TreeNeuron   SWC  2137190164    15404          None         867      870     4712362.0  None  1 nanometer  2024-09-18 12:21:24.117259  string
3  navis.TreeNeuron   SWC  2268989790    18105          None         945      946     5545407.5  None  1 nanometer  2024-09-18 12:21:24.389731  string

Try it out!

If you are working a lot with Neuroglancer and need to e.g. generate or parse URLs, you might want to check out the nglscenes package.

Total running time of the script: (0 minutes 4.111 seconds)

Download Python source code: plot_01_remote_cloudvolume.py

Download Jupyter notebook: plot_01_remote_cloudvolume.ipynb

Gallery generated by mkdocs-gallery