skeletor.skeletonize

The skeletor.skeletonize module contains functions for skeletonization of meshes.

There are several approaches to skeletonizing a mesh. Which one to pick depends (among other things) on the shape of your mesh and the skeleton quality you want to get out of it. In general, unless your mesh already looks like a tube, I recommend looking into mesh contraction³.

Please see the documentation of the individual functions for details, but here is a quick summary:

| function                                    | speed | robust | radii¹ | mesh map² | description                                      |
| ------------------------------------------- | :---: | :----: | :----: | :-------: | ------------------------------------------------ |
| `skeletor.skeletonize.by_wavefront()`       | +++   | ++     | yes    | yes       | works well for tubular meshes                    |
| `skeletor.skeletonize.by_vertex_clusters()` | ++    | +      | no     | yes       | best with contracted meshes³                     |
| `skeletor.skeletonize.by_teasar()`          | +     | ++     | no     | yes       | works on mesh surface                            |
| `skeletor.skeletonize.by_tangent_ball()`    | ++    | 0      | yes    | yes       | works with mesh normals                          |
| `skeletor.skeletonize.by_edge_collapse()`   | -     | 0      | no     | no        | published with [1] - never got this to work well |

References

[1] Au OK, Tai CL, Chu HK, Cohen-Or D, Lee TY. Skeleton extraction by mesh contraction. ACM Transactions on Graphics (TOG). 2008 Aug 1;27(3):44.


  1. radii can also be added in postprocessing with skeletor.post.radii() 

  2. a mapping from the mesh's vertices to skeleton nodes 

  3. use skeletor.pre.contract() 
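
For example, a minimal end-to-end run (a sketch only; the `fix_mesh` parameters and the choice of `by_wavefront` follow the function docs below and should be tuned to your data):

>>> import skeletor as sk
>>> mesh = sk.example_mesh()
>>> fixed = sk.pre.fix_mesh(mesh, remove_disconnected=5, inplace=False)
>>> skel = sk.skeletonize.by_wavefront(fixed, waves=1, step_size=1)
>>> skel  # a skeletor.Skeleton holding the SWC table, mesh and mesh map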

#    This script is part of skeletor (http://www.github.com/navis-org/skeletor).
#    Copyright (C) 2018 Philipp Schlegel
#
#    This program is free software: you can redistribute it and/or modify
#    it under the terms of the GNU General Public License as published by
#    the Free Software Foundation, either version 3 of the License, or
#    (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program.

r"""
The `skeletor.skeletonize` module contains functions for skeletonization
of meshes.

There are several approaches to skeletonizing a mesh. Which one to pick depends
(among other things) on the shape of your mesh and the skeleton quality you want
to get out of it. In general, unless your mesh already looks like a tube, I
recommend looking into mesh contraction [^1].

Please see the documentation of the individual functions for details but here
is a quick summary:

| function                                    | speed | robust | radii [^2] | mesh map [^3] | description                                        |
| ------------------------------------------- | :---: | :----: | :--------: | :-----------: | ---------------------------------------------------|
| `skeletor.skeletonize.by_wavefront()`       | +++   | ++     | yes        | yes           | works well for tubular meshes                      |
| `skeletor.skeletonize.by_vertex_clusters()` | ++    | +      | no         | yes           | best with contracted meshes [^1]                   |
| `skeletor.skeletonize.by_teasar()`          | +     | ++     | no         | yes           | works on mesh surface                              |
| `skeletor.skeletonize.by_tangent_ball()`    | ++    | 0      | yes        | yes           | works with mesh normals                            |
| `skeletor.skeletonize.by_edge_collapse()`   | -     | 0      | no         | no            | published with [1] - never got this to work well   |

[^1]: use `skeletor.pre.contract()`
[^2]: radii can also be added in postprocessing with `skeletor.post.radii()`
[^3]: a mapping from the mesh's vertices to skeleton nodes

## References

`[1] Au OK, Tai CL, Chu HK, Cohen-Or D, Lee TY. Skeleton extraction by mesh contraction. ACM Transactions on Graphics (TOG). 2008 Aug 1;27(3):44.`

"""

from .edge_collapse import *
from .vertex_cluster import *
from .wave import *
from .teasar import *
from .tangent_ball import *

__docformat__ = "numpy"
__all__ = ['by_teasar', 'by_wavefront', 'by_vertex_clusters',
           'by_edge_collapse', 'by_tangent_ball']
def by_teasar(mesh, inv_dist, min_length=None, root=None, progress=True):
def by_teasar(mesh, inv_dist, min_length=None, root=None, progress=True):
    """Skeletonize a mesh using the TEASAR algorithm [1].

    This algorithm finds the longest path from a root vertex and invalidates
    all vertices that are within `inv_dist` of that path. Then picks the
    second longest (and still valid) path and does the same. Rinse & repeat
    until all vertices have been invalidated. It's fast + works very well with
    tubular meshes, and with `inv_dist` you have control over the level of
    detail. Note that by its nature the skeleton will be exactly on the
    surface of the mesh.

    Based on the implementation by Sven Dorkenwald, Casey Schneider-Mizell and
    Forrest Collman in `meshparty` (https://github.com/sdorkenw/MeshParty).

    Parameters
    ----------
    mesh :          mesh obj
                    The mesh to be skeletonized. Can be an object that has
                    ``.vertices`` and ``.faces`` properties (e.g. a
                    trimesh.Trimesh) or a tuple ``(vertices, faces)`` or a
                    dictionary ``{'vertices': vertices, 'faces': faces}``.
    inv_dist :      int | float
                    Distance along the mesh used for invalidation of vertices.
                    This controls how detailed (or noisy) the skeleton will be.
    min_length :    float, optional
                    If provided, will skip any branch that is shorter than
                    `min_length`. Use this to get rid of noise but note that
                    it will lead to vertices not being mapped to skeleton nodes.
                    Such vertices will show up with index -1 in
                    `Skeleton.mesh_map`.
    root :          int, optional
                    Vertex ID of a root. If not provided will use ``0``.
    progress :      bool, optional
                    If True, will show progress bar.

    Returns
    -------
    skeletor.Skeleton
                    Holds results of the skeletonization and enables quick
                    visualization.

    References
    ----------
    [1] Sato, M., Bitter, I., Bender, M. A., Kaufman, A. E., & Nakajima, M.
        (n.d.). TEASAR: tree-structure extraction algorithm for accurate and
        robust skeletons. In Proceedings of the Eighth Pacific Conference on
        Computer Graphics and Applications. IEEE Comput. Soc.
        https://doi.org/10.1109/pccga.2000.883951

    """
    mesh = make_trimesh(mesh, validate=False)

    # Generate Graph (must be undirected)
    G = ig.Graph(edges=mesh.edges_unique, directed=False)
    G.es['weight'] = mesh.edges_unique_length

    if not root:
        root = 0

    edges = np.array([], dtype=np.int64)
    mesh_map = np.full(mesh.vertices.shape[0], fill_value=-1)

    with tqdm(desc='Invalidating', total=len(G.vs),
              disable=not progress, leave=False) as pbar:
        for cc in sorted(G.clusters(), key=len, reverse=True):
            # Make a subgraph for this connected component
            SG = G.subgraph(cc)
            cc = np.array(cc)

            # Find root within subgraph
            if root in cc:
                this_root = np.where(cc == root)[0][0]
            else:
                this_root = 0

            # Get the sparse adjacency matrix of the subgraph
            sp = SG.get_adjacency_sparse('weight')

            # Get lengths of paths to all nodes from root
            paths = SG.shortest_paths(this_root, target=None, weights='weight',
                                      mode='ALL')[0]
            paths = np.array(paths)

            # Prep array for invalidation
            valid = ~np.zeros(paths.shape).astype(bool)
            invalidated = 0

            while np.any(valid):
                # Find the farthest point
                farthest = np.argmax(paths)

                # Get path from root to farthest point
                path = SG.get_shortest_paths(this_root, farthest,
                                             weights='weight', mode='ALL')[0]

                # Get IDs of edges along the path
                if getattr(ig, '__version_info__', (0, 0, 0))[1] >= 10:
                    pairs = zip(path[:-1], path[1:])
                    eids = SG.get_eids(pairs, directed=False)
                else:
                    eids = SG.get_eids(path=path, directed=False)

                # Stop if farthest point is closer than min_length
                add = True
                if min_length:
                    # This should only be the distance to the first branchpoint
                    # from the tip since we set other weights to zero
                    le = sum(SG.es[eids].get_attribute_values('weight'))
                    if le < min_length:
                        add = False

                if add:
                    # Add these new edges
                    new_edges = np.vstack((cc[path[:-1]], cc[path[1:]])).T
                    edges = np.append(edges, new_edges).reshape(-1, 2)

                # Invalidate points in the path
                valid[path] = False
                paths[path] = 0

                # Must set weights along path to 0 so that this path is
                # taken again in future iterations
                SG.es[eids]['weight'] = 0

                # Get all nodes within `inv_dist` to this path
                # Note: can we somehow only include still valid nodes to speed
                # things up?
                dist, _, sources = dijkstra(sp, directed=False, indices=path,
                                            limit=inv_dist, min_only=True,
                                            return_predecessors=True)

                # Invalidate
                in_dist = dist <= inv_dist
                to_invalidate = np.where(in_dist)[0]
                valid[to_invalidate] = False
                paths[to_invalidate] = 0

                # Update mesh vertex to skeleton node map
                mesh_map[cc[in_dist]] = cc[sources[in_dist]]

                pbar.update((~valid).sum() - invalidated)
                invalidated = (~valid).sum()

    # Make unique edges (paths will have overlapped!)
    edges = unique(edges, axis=0)

    # Create a directed acyclic and hierarchical graph
    G_nx = edges_to_graph(edges=edges[:, [1, 0]],
                          fix_tree=True, fix_edges=False,
                          weight=False)

    # Generate the SWC table
    swc, new_ids = make_swc(G_nx, coords=mesh.vertices, reindex=True)

    # Update vertex to node ID map
    mesh_map = np.array([new_ids.get(n, -1) for n in mesh_map])

    return Skeleton(swc=swc, mesh=mesh, mesh_map=mesh_map, method='teasar')

Skeletonize a mesh using the TEASAR algorithm [1].

This algorithm finds the longest path from a root vertex and invalidates all vertices that are within inv_dist of that path. Then picks the second longest (and still valid) path and does the same. Rinse & repeat until all vertices have been invalidated. It's fast + works very well with tubular meshes, and with inv_dist you have control over the level of detail. Note that by its nature the skeleton will be exactly on the surface of the mesh.

Based on the implementation by Sven Dorkenwald, Casey Schneider-Mizell and Forrest Collman in meshparty (https://github.com/sdorkenw/MeshParty).

Parameters
  • mesh (mesh obj): The mesh to be skeletonized. Can be an object that has .vertices and .faces properties (e.g. a trimesh.Trimesh) or a tuple (vertices, faces) or a dictionary {'vertices': vertices, 'faces': faces}.
  • inv_dist (int | float): Distance along the mesh used for invalidation of vertices. This controls how detailed (or noisy) the skeleton will be.
  • min_length (float, optional): If provided, will skip any branch that is shorter than min_length. Use this to get rid of noise but note that it will lead to vertices not being mapped to skeleton nodes. Such vertices will show up with index -1 in Skeleton.mesh_map.
  • root (int, optional): Vertex ID of a root. If not provided will use 0.
  • progress (bool, optional): If True, will show progress bar.
Returns
  • skeletor.Skeleton: Holds results of the skeletonization and enables quick visualization.
References

[1] Sato, M., Bitter, I., Bender, M. A., Kaufman, A. E., & Nakajima, M. (n.d.). TEASAR: tree-structure extraction algorithm for accurate and robust skeletons. In Proceedings of the Eighth Pacific Conference on Computer Graphics and Applications. IEEE Comput. Soc. https://doi.org/10.1109/pccga.2000.883951
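
A usage sketch (hedged: inv_dist and min_length are in mesh units, and the values below are purely illustrative):

>>> import skeletor as sk
>>> mesh = sk.example_mesh()
>>> fixed = sk.pre.fix_mesh(mesh, remove_disconnected=5)
>>> skel = sk.skeletonize.by_teasar(fixed, inv_dist=500, min_length=1000)
>>> (skel.mesh_map == -1).sum()  # vertices of skipped branches map to -1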

def by_wavefront(mesh, waves=1, origins=None, step_size=1, radius_agg='mean', progress=True):
def by_wavefront(mesh,
                 waves=1,
                 origins=None,
                 step_size=1,
                 radius_agg='mean',
                 progress=True):
    """Skeletonize a mesh using wave fronts.

    The algorithm tries to find rings of vertices and collapse them to
    their center. This is done by propagating a wave across the mesh starting at
    a single seed vertex. As the wave travels across the mesh we keep track of
    which vertices are encountered at each step. Groups of connected
    vertices that are "hit" by the wave at the same time are considered rings
    and subsequently collapsed. By its nature this works best with tubular meshes.

    Parameters
    ----------
    mesh :          mesh obj
                    The mesh to be skeletonized. Can be an object that has
                    ``.vertices`` and ``.faces`` properties (e.g. a
                    trimesh.Trimesh) or a tuple ``(vertices, faces)`` or a
                    dictionary ``{'vertices': vertices, 'faces': faces}``.
    waves :         int
                    Number of waves to run across the mesh. Each wave is
                    initialized at a different vertex which produces slightly
                    different rings. The final skeleton is produced from a mean
                    across all waves. More waves produce higher resolution
                    skeletons but also introduce more noise.
    origins :       int | list of ints, optional
                    Vertex ID(s) where the wave(s) are initialized. If we run
                    out of origins (either because there are fewer `origins`
                    than `waves` or because a connected component has no
                    origin), we will fall back to a semi-random origin.
    step_size :     int
                    Values greater than 1 effectively lead to binning of rings.
                    For example, a step size of 2 means that two adjacent vertex
                    rings will be collapsed to the same center. This can help
                    reduce noise in the skeleton (and as such counteracts a
                    large number of waves).
    radius_agg :    "mean" | "median" | "max" | "min" | "percentile75" | "percentile25"
                    Function used to aggregate radii over the sample (i.e. the
                    vertices forming a ring that we collapse to its center).
    progress :      bool
                    If True, will show progress bar.

    Returns
    -------
    skeletor.Skeleton
                    Holds results of the skeletonization and enables quick
                    visualization.

    """
    agg_map = {'mean': np.mean, 'max': np.max, 'min': np.min,
               'median': np.median,
               'percentile75': lambda x: np.percentile(x, 75),
               'percentile25': lambda x: np.percentile(x, 25)}
    assert radius_agg in agg_map, f'Unknown `radius_agg`: "{radius_agg}"'
    rad_agg_func = agg_map[radius_agg]

    mesh = make_trimesh(mesh, validate=False)

    centers_final, radii_final, G = _cast_waves(mesh, waves=waves,
                                                origins=origins,
                                                step_size=step_size,
                                                rad_agg_func=rad_agg_func,
                                                progress=progress)

    # Collapse vertices into nodes
    (node_centers,
     vertex_to_node_map) = np.unique(centers_final,
                                     return_inverse=True, axis=0)

    # Map radii for individual vertices to the collapsed nodes
    # Using pandas is the fastest way here
    node_radii = pd.DataFrame()
    node_radii['node_id'] = vertex_to_node_map
    node_radii['radius'] = radii_final
    node_radii = node_radii.groupby('node_id').radius.apply(rad_agg_func).values

    # Contract vertices
    G.contract_vertices(vertex_to_node_map)

    # Remove self loops and duplicate edges
    G = G.simplify()

    # Generate hierarchical tree
    el = np.array(G.get_edgelist())

    if PRESERVE_BACKBONE:
        # Use the mean radius of the two vertices in an edge
        weights_rad = np.vstack((node_radii[el[:, 0]],
                                 node_radii[el[:, 1]])).mean(axis=0)

        # For each node generate a vector based on its immediate neighbors
        vect, alpha = dotprops(node_centers)
        weights_alpha = np.vstack((alpha[el[:, 0]],
                                   alpha[el[:, 1]])).mean(axis=0)

        # Combine both which means we are most likely to cut at small branches
        # outside of the backbone
        weights = weights_rad * weights_alpha

        # MST doesn't like 0 for weights
        weights[weights <= 0] = weights[weights > 0].min() / 2

    else:
        weights = np.linalg.norm(node_centers[el[:, 0]] - node_centers[el[:, 1]], axis=1)
    tree = G.spanning_tree(weights=1 / weights)

    # Create a directed acyclic and hierarchical graph
    G_nx = edges_to_graph(edges=np.array(tree.get_edgelist()),
                          nodes=np.arange(0, len(G.vs)),
                          fix_tree=True,  # this makes sure graph is oriented
                          drop_disconnected=False)

    # Generate the SWC table
    swc = make_swc(G_nx, coords=node_centers, reindex=False, validate=True)
    swc['radius'] = node_radii[swc.node_id.values]
    _, new_ids = reindex_swc(swc, inplace=True)

    # Update vertex to node ID map
    vertex_to_node_map = np.array([new_ids[n] for n in vertex_to_node_map])

    return Skeleton(swc=swc, mesh=mesh, mesh_map=vertex_to_node_map,
                    method='wavefront')

Skeletonize a mesh using wave fronts.

The algorithm tries to find rings of vertices and collapse them to their center. This is done by propagating a wave across the mesh starting at a single seed vertex. As the wave travels across the mesh we keep track of which vertices are encountered at each step. Groups of connected vertices that are "hit" by the wave at the same time are considered rings and subsequently collapsed. By its nature this works best with tubular meshes.

Parameters
  • mesh (mesh obj): The mesh to be skeletonized. Can be an object that has .vertices and .faces properties (e.g. a trimesh.Trimesh) or a tuple (vertices, faces) or a dictionary {'vertices': vertices, 'faces': faces}.
  • waves (int): Number of waves to run across the mesh. Each wave is initialized at a different vertex which produces slightly different rings. The final skeleton is produced from a mean across all waves. More waves produce higher resolution skeletons but also introduce more noise.
  • origins (int | list of ints, optional): Vertex ID(s) where the wave(s) are initialized. If we run out of origins (either because there are fewer origins than waves or because a connected component has no origin), we will fall back to a semi-random origin.
  • step_size (int): Values greater than 1 effectively lead to binning of rings. For example, a step size of 2 means that two adjacent vertex rings will be collapsed to the same center. This can help reduce noise in the skeleton (and as such counteracts a large number of waves).
  • radius_agg ("mean" | "median" | "max" | "min" | "percentile75" | "percentile25"): Function used to aggregate radii over the sample (i.e. the vertices forming a ring that we collapse to its center).
  • progress (bool): If True, will show progress bar.
Returns
  • skeletor.Skeleton: Holds results of the skeletonization and enables quick visualization.
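
A sketch of trading detail against noise with waves and step_size (values are illustrative, not recommendations):

>>> import skeletor as sk
>>> fixed = sk.pre.fix_mesh(sk.example_mesh(), remove_disconnected=5)
>>> # single wave, no binning: maximum detail
>>> fine = sk.skeletonize.by_wavefront(fixed, waves=1, step_size=1)
>>> # several waves plus ring binning: smoother but coarser
>>> smooth = sk.skeletonize.by_wavefront(fixed, waves=3, step_size=2,
...                                      radius_agg='median')
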
def by_vertex_clusters(mesh, sampling_dist, cluster_pos='median', progress=True):
def by_vertex_clusters(mesh, sampling_dist, cluster_pos='median', progress=True):
    """Skeletonize a (contracted) mesh by clustering vertices.

    The algorithm traverses the mesh graph and groups vertices together that
    are within a given distance to each other. This uses the geodesic
    (along-the-mesh) distance, not simply the Euclidean distance. Subsequently
    these groups of vertices are collapsed and re-connected respecting the
    topology of the input mesh.

    The graph traversal is fast and scales well, so this method is well suited
    for meshes with lots of vertices. On the downside: this implementation is
    not very clever and you might have to play around with the parameters
    (mostly ``sampling_dist``) to get decent results.

    Parameters
    ----------
    mesh :          mesh obj
                    The mesh to be skeletonized. Can be an object that has
                    ``.vertices`` and ``.faces`` properties (e.g. a
                    trimesh.Trimesh) or a tuple ``(vertices, faces)`` or a
                    dictionary ``{'vertices': vertices, 'faces': faces}``.
    sampling_dist : float | int
                    Maximal distance at which vertices are clustered. This
                    parameter should be tuned based on the resolution of your
                    mesh (see Examples).
    cluster_pos :   "median" | "center"
                    How to determine the x/y/z coordinates of the collapsed
                    vertex clusters (i.e. the skeleton's nodes)::

                      - "median": Use the vertex closest to the cluster's
                        center of mass.
                      - "center": Use the center of mass. This makes for smoother
                        skeletons but can lead to nodes outside the mesh.
    progress :      bool
                    If True, will show progress bar.

    Examples
    --------
    >>> import skeletor as sk
    >>> mesh = sk.example_mesh()
    >>> cont = sk.pre.contract(mesh, epsilon=0.1)
    >>> skel = sk.skeletonize.by_vertex_clusters(cont, sampling_dist=100)
    >>> skel.mesh = mesh

    Returns
    -------
    skeletor.Skeleton
                    Holds results of the skeletonization and enables quick
                    visualization.

    """
    assert cluster_pos in ['center', 'median']

    mesh = make_trimesh(mesh, validate=False)

    # Produce weighted edges
    edges = np.concatenate((mesh.edges_unique,
                            mesh.edges_unique_length.reshape(mesh.edges_unique.shape[0], 1)),
                           axis=1)

    # Generate Graph (must be undirected)
    G = nx.Graph()
    G.add_weighted_edges_from(edges)

    # Run the graph traversal that groups vertices into spatial clusters
    not_visited = set(G.nodes)
    seen = set()
    clusters = []
    to_visit = len(not_visited)
    with tqdm(desc='Clustering', total=len(not_visited), disable=progress is False) as pbar:
        while not_visited:
            # Pick a random node
            start = not_visited.pop()
            # Get all nodes in the geodesic vicinity
            cl, seen = dfs(G, n=start, dist_traveled=0,
                           max_dist=sampling_dist, seen=seen)
            cl = set(cl)

            # Append this cluster and track visited/not-visited nodes
            clusters.append(cl)
            not_visited = not_visited - cl

            # Update progress bar
            pbar.update(to_visit - len(not_visited))
            to_visit = len(not_visited)

    # `clusters` is a list of sets -> let's turn it into list of arrays
    clusters = [np.array(list(c)).astype(int) for c in clusters]

    # Get positions of clusters
    if cluster_pos == 'center':
        # Get the center of each cluster
        cl_coords = np.array([np.mean(mesh.vertices[c], axis=0) for c in clusters])
    elif cluster_pos == 'median':
        # Get the node that's closest to the cluster's center
        cl_coords = []
        for c in clusters:
            cnt = np.mean(mesh.vertices[c], axis=0)
            cnt_dist = np.sum(np.fabs(mesh.vertices[c] - cnt), axis=1)
            median = mesh.vertices[c][np.argmin(cnt_dist)]
            cl_coords.append(median)
        cl_coords = np.array(cl_coords)

    # Generate edges
    cl_edges = np.array(mesh.edges_unique)
    if fastremap:
        mapping = {n: i for i, l in enumerate(clusters) for n in l}
        cl_edges = fastremap.remap(cl_edges, mapping, preserve_missing_labels=False, in_place=True)
    else:
        for i, c in enumerate(clusters):
            cl_edges[np.isin(cl_edges, c)] = i

    # Remove directionality from cluster edges
    cl_edges = np.sort(cl_edges, axis=1)

    # Get unique edges
    cl_edges = np.unique(cl_edges, axis=0)

    # Calculate edge lengths
    co1 = cl_coords[cl_edges[:, 0]]
    co2 = cl_coords[cl_edges[:, 1]]
    cl_edge_lengths = np.sqrt(np.sum((co1 - co2)**2, axis=1))

    # Produce adjacency matrix from edges and edge lengths
    n_clusters = len(clusters)
    adj = scipy.sparse.coo_matrix((cl_edge_lengths,
                                   (cl_edges[:, 0], cl_edges[:, 1])),
                                  shape=(n_clusters, n_clusters))

    # The cluster graph likely still contains cycles, let's get rid of them
    # using a minimum spanning tree
    mst = scipy.sparse.csgraph.minimum_spanning_tree(adj,
                                                     overwrite=True)

    # Turn into COO matrix
    coo = mst.tocoo()

    # Extract edge list
    edges = np.array([coo.row, coo.col]).T

    # Produce final graph - this also takes care of some fixing
    G = edges_to_graph(edges, nodes=np.unique(cl_edges.flatten()),
                       drop_disconnected=False, fix_tree=True)

    # Generate a mesh vertex -> skeleton vertex map
    # Note that nodes are labeled by index of the cluster
    vertex_to_node_map = [i for i, cl in enumerate(clusters) for n in cl]

    # Generate SWC
    swc, new_ids = make_swc(G, cl_coords, reindex=True, validate=False)

    # Update mesh map
    vertex_to_node_map = np.array([new_ids[n] for n in vertex_to_node_map])

    return Skeleton(swc=swc, mesh=mesh, mesh_map=vertex_to_node_map,
                    method='vertex_clusters')

Skeletonize a (contracted) mesh by clustering vertices.

The algorithm traverses the mesh graph and groups vertices together that are within a given distance to each other. This uses the geodesic (along-the-mesh) distance, not simply the Euclidean distance. Subsequently these groups of vertices are collapsed and re-connected respecting the topology of the input mesh.

The graph traversal is fast and scales well, so this method is well suited for meshes with lots of vertices. On the downside: this implementation is not very clever and you might have to play around with the parameters (mostly sampling_dist) to get decent results.

Parameters
  • mesh (mesh obj): The mesh to be skeletonized. Can be an object that has .vertices and .faces properties (e.g. a trimesh.Trimesh) or a tuple (vertices, faces) or a dictionary {'vertices': vertices, 'faces': faces}.
  • sampling_dist (float | int): Maximal distance at which vertices are clustered. This parameter should be tuned based on the resolution of your mesh (see Examples).
  • cluster_pos ("median" | "center"): How to determine the x/y/z coordinates of the collapsed vertex clusters (i.e. the skeleton's nodes):

    • "median": Use the vertex closest to the cluster's center of mass.
    • "center": Use the center of mass. This makes for smoother skeletons but can lead to nodes outside the mesh.
  • progress (bool): If True, will show progress bar.
Examples
>>> import skeletor as sk
>>> mesh = sk.example_mesh()
>>> cont = sk.pre.contract(mesh, epsilon=0.1)
>>> skel = sk.skeletonize.by_vertex_clusters(cont, sampling_dist=100)
>>> skel.mesh = mesh
Returns
  • skeletor.Skeleton: Holds results of the skeletonization and enables quick visualization.
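
This method does not produce radii; a hedged sketch of adding them afterwards via the postprocessing mentioned in the footnotes above (the method='knn' argument is an assumption about skeletor.post.radii()):

>>> import skeletor as sk
>>> mesh = sk.example_mesh()
>>> cont = sk.pre.contract(mesh, epsilon=0.1)
>>> skel = sk.skeletonize.by_vertex_clusters(cont, sampling_dist=100)
>>> skel.mesh = mesh  # swap the contracted mesh back for the original
>>> sk.post.radii(skel, method='knn')  # 'knn' is assumed here
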
def by_edge_collapse(mesh, shape_weight=1, sample_weight=0.1, progress=True):
def by_edge_collapse(mesh, shape_weight=1, sample_weight=0.1, progress=True):
    """Skeletonize a (contracted) mesh by iteratively collapsing edges.

    This algorithm (described in [1]) iteratively collapses edges that are part
    of a face until no more faces are left. Edges are chosen based on a cost
    function that penalizes collapses that would change the shape of the object
    or would introduce long edges.

    This is somewhat sensitive to the dimensions of the input mesh: too large
    and you might experience slow-downs or numpy OverflowErrors; too low and
    you might get skeletons that don't quite match the mesh (e.g. too few nodes).
    If you experience either, try down- or up-scaling your mesh, respectively.

    Parameters
    ----------
    mesh :          mesh obj
                    The mesh to be skeletonized. Can be an object that has
                    ``.vertices`` and ``.faces`` properties (e.g. a
                    trimesh.Trimesh) or a tuple ``(vertices, faces)`` or a
                    dictionary ``{'vertices': vertices, 'faces': faces}``.
    shape_weight :  float, optional
                    Weight for shape costs which penalize collapsing edges that
                    would drastically change the shape of the object.
    sample_weight : float, optional
                    Weight for sampling costs which penalize collapses that
                    would generate prohibitively long edges.
    progress :      bool
                    If True, will show progress bar.

    Returns
    -------
    skeletor.Skeleton
                    Holds results of the skeletonization and enables quick
                    visualization.

    References
    ----------
    [1] Au OK, Tai CL, Chu HK, Cohen-Or D, Lee TY. Skeleton extraction by mesh
        contraction. ACM Transactions on Graphics (TOG). 2008 Aug 1;27(3):44.

    """
    mesh = make_trimesh(mesh, validate=False)

    # Shorthand faces and edges
    # We convert to arrays to (a) make a copy and (b) remove potential overhead
    # from these originally being trimesh TrackedArrays
    edges = np.array(mesh.edges_unique)
    verts = np.array(mesh.vertices)

    # For cost calculations we will normalise coordinates
    # This prevents getting ridiculously large cost values (~e300)
    # verts = (verts - verts.min()) / (verts.max() - verts.min())
    edge_lengths = np.sqrt(np.sum((verts[edges[:, 0]] - verts[edges[:, 1]])**2, axis=1))

    # Get a list of faces: [(edge1, edge2, edge3), ...]
    face_edges = np.array(mesh.faces_unique_edges)
    # Make sure these faces are unique, i.e. no [(e1, e2, e3), (e3, e2, e1)]
    face_edges = np.unique(np.sort(face_edges, axis=1), axis=0)

    # Shape cost initialisation:
    # Each vertex has a matrix Q which is used to determine the shape cost
    # of collapsing each node. We need to generate a matrix (Q) for each
    # vertex, then when we collapse two nodes, we can update using
    # Qj <- Qi + Qj, so the edges previously associated with vertex i
    # are now associated with vertex j.

    # For each edge, generate a matrix (K). K is made up of two sets of
    # coordinates in 3D space, a and b. a is the normalised edge vector
    # of edge(i,j) and b = a * <x/y/z coordinates of vertex i>
    #
    # The matrix K takes the form:
    #
    #        Kij = 0, -az, ay, -bx
    #              az, 0, -ax, -by
    #             -ay, ax, 0,  -bz

    edge_co0, edge_co1 = verts[edges[:, 0]], verts[edges[:, 1]]
    a = (edge_co1 - edge_co0) / edge_lengths.reshape(edges.shape[0], 1)
    # Note: It's a bit unclear to me whether the normalised edge vector should
    # be allowed to have negative values but I seem to be getting better
    # results if I use absolute values
    a = np.fabs(a)
    b = a * edge_co0

    # Bunch of zeros
    zero = np.zeros(a.shape[0])

    # Generate matrix K
    K = [[zero,    -a[:, 2], a[:, 1],    -b[:, 0]],
         [a[:, 2],  zero,    -a[:, 0],   -b[:, 1]],
         [-a[:, 1], a[:, 0], zero,       -b[:, 2]]]
    K = np.array(K)

    # Q for vertex i is then the sum of the products of (kT,k) for ALL edges
    # connected to vertex i:
    # Initialize matrix of correct shape
    Q_array = np.zeros((4, 4, verts.shape[0]), dtype=np.float64)

    # Generate (kT, K)
    kT = np.transpose(K, axes=(1, 0, 2))

    # To get the sum of the products in the correct format we have to
    # do some annoying transposes to get to (4, 4, len(edges))
    K_dot = np.matmul(K.T, kT.T).T

    # Iterate over all vertices
    for v in range(len(verts)):
        # Find edges that contain this vertex
        cond1 = edges[:, 0] == v
        cond2 = edges[:, 1] == v
        # Note that this does not take directionality of edges into account
        # Not sure if that's intended?

        # Get indices of these edges
        indices = np.where(cond1 | cond2)[0]

        # Get the products for all edges adjacent to this vertex
        Q = K_dot[:, :, indices]
        # Sum over all edges
        Q = Q.sum(axis=2)
        # Add to Q array
        Q_array[:, :, v] = Q

    # Not sure if we are doing something wrong when calculating the Q array but
    # we end up having negative values which translate into negative scores.
    # This in turn is bad because we propagate that negative score when
    # collapsing edges which leads to a "zipper-effect" where nodes collapse
    # in sequence a->b->c->d until they hit some node with really high cost
    # Q_array -= Q_array.min()

    # Edge collapse:
    # Determining which edge to collapse is a weighted sum of the shape and
    # sampling cost. The shape cost of vertex i is Fa(p) = pT Qi p where p is
    # the coordinates of point p (vertex i here) in homogeneous representation.
    # The variable w from above is the value for the homogeneous 4th dimension.
    # T denotes transpose of matrix.
    # The shape cost of collapsing the edge Fa(i,j) = Fi(vj) + Fj(vj).
    # vi and vj being the coordinates of the vertex in homogeneous representation
    # (p in equation before)
    # The sampling cost penalises edge collapses that generate overly long edges,
    # based on the distance traveled by all edges to vi, when vi is merged with
    # vj. (Eq. 7 in paper)
    # You cannot collapse an edge (i -> j) if k is a common adjacent vertex of
    # both i and j, but (i/j/k) is not a face.
    # We will set the cost of these edges to infinity.

    # Now work out the shape cost of collapsing each node (eq. 7)
    # First get coordinates of the first node of each edge
    # Note that in Nik's implementation this was the second node
    p = verts[edges[:, 0]]

    # Append weight factor
    w = 1
    p = np.append(p, np.full((p.shape[0], 1), w), axis=1)

    this_Q1 = Q_array[:, :, edges[:, 0]]
    this_Q2 = Q_array[:, :, edges[:, 1]]

    F1 = np.einsum('ij,kji->ij', p, this_Q1)[:, [0, 1]]
    F2 = np.einsum('ij,kji->ij', p, this_Q2)[:, [0, 1]]

    # Calculate and append shape cost
    F = np.append(F1, F2, axis=1)
    shape_cost = np.sum(F, axis=1)

    # Sum lengths of all edges associated with a given vertex
    # This is easiest by generating a sparse matrix from the edges
    # and then summing by row
    adj = scipy.sparse.coo_matrix((edge_lengths,
                                   (edges[:, 0], edges[:, 1])),
                                  shape=(verts.shape[0], verts.shape[0]))

    # This makes sure the matrix is symmetrical, i.e. a->b == a<-b
    # Note that I'm not sure whether this is strictly necessary but it really
    # can't hurt
    adj = adj + adj.T

    # Get the lengths associated with each vertex
    verts_lengths = adj.sum(axis=1)

    # We need to flatten this (something funny with summing sparse matrices)
    verts_lengths = np.array(verts_lengths).flatten()

    # Map the sum of vertex lengths onto edges (as per first vertex in edge)
    ik_edge = verts_lengths[edges[:, 0]]

    # Calculate sampling cost
    sample_cost = edge_lengths * (ik_edge - edge_lengths)

    # Determine which edge to collapse and collapse it
    # Total Cost - weighted sum of shape and sample cost, equation 8 in paper
    F_T = shape_cost * shape_weight + sample_cost * sample_weight

    # Now start collapsing edges one at a time
    face_count = face_edges.shape[0]  # keep track of face counts for progress bar
    is_collapsed = np.full(edges.shape[0], False)
    keep = np.full(edges.shape[0], False)
    with tqdm(desc='Collapsing edges', total=face_count, disable=progress is False) as pbar:
        while face_edges.size:
            # Uncomment to get a more-or-less random edge collapse
            # F_T[:] = 0

            # Update progress bar
            pbar.update(face_count - face_edges.shape[0])
            face_count = face_edges.shape[0]

            # This has to come at the beginning of the loop
            # Set cost of collapsing edges without faces to infinite
            F_T[keep] = np.inf
            F_T[is_collapsed] = np.inf

            # Get the edge that we want to collapse
            collapse_ix = np.argmin(F_T)
            # Get the vertices this edge connects
            u, v = edges[collapse_ix]
            # Get all edges that contain these vertices:
            # First, edges that are (uv, x)
            connects_uv = np.isin(edges[:, 0], [u, v])
            # Second, check if any (uv, x) edges are (uv, uv)
            connects_uv[connects_uv] = np.isin(edges[connects_uv, 1], [u, v])

            # Remove uu and vv edges
            uuvv = edges[:, 0] == edges[:, 1]
            connects_uv = connects_uv & ~uuvv
            # Get the edge's indices
            clps_edges = np.where(connects_uv)[0]

            # Now find the faces the collapsed edge is part of
            # Note: splitting this into three conditions is marginally faster than
            # np.any(np.isin(face_edges, clps_edges), axis=1)
            uv0 = np.isin(face_edges[:, 0], clps_edges)
            uv1 = np.isin(face_edges[:, 1], clps_edges)
            uv2 = np.isin(face_edges[:, 2], clps_edges)
            has_uv = uv0 | uv1 | uv2

            # If these edges do not have adjacent faces anymore
            if not np.any(has_uv):
                # Track this edge as a keeper
                keep[clps_edges] = True
                continue

            # Get the collapsed faces [(e1, e2, e3), ...] for this edge
            clps_faces = face_edges[has_uv]

            # Remove the collapsed faces
            face_edges = face_edges[~has_uv]

            # Track these edges as collapsed
            is_collapsed[clps_edges] = True

            # Get the adjacent edges (i.e. non-uv edges)
            adj_edges = clps_faces[~np.isin(clps_faces, clps_edges)].reshape(clps_faces.shape[0], 2)

            # We have to do some sorting and finding unique edges to make sure
            # remapping is done correctly further down
            # NOTE: Not sure we really need this, so leaving it out for now
            # adj_edges = np.unique(np.sort(adj_edges, axis=1), axis=0)

            # We need to keep track of changes to the adjacent faces
            # Basically each face in (i, j, k) will be reduced to one edge
            # which points from u -> v
            # -> replace occurrences of the losing edge with the winning edge
            for win, loose in adj_edges:
                if fastremap:
                    face_edges = fastremap.remap(face_edges, {loose: win},
                                                 preserve_missing_labels=True,
                                                 in_place=True)
                else:
                    face_edges[face_edges == loose] = win
                is_collapsed[loose] = True

            # Replace occurrences of first node u with second node v
            if fastremap:
                edges = fastremap.remap(edges, {u: v},
                                        preserve_missing_labels=True,
                                        in_place=True)
            else:
                edges[edges == u] = v

            # Add shape cost of u to shape costs of v
            Q_array[:, :, v] += Q_array[:, :, u]

            # Determine which edges require update of costs:
            # In theory we only need to update costs for edges that are
            # associated with vertices v and u (which is now also v)
            has_v = (edges[:, 0] == v) | (edges[:, 1] == v)

            # Uncomment to temporarily force updating costs for all edges
            # has_v[:] = True

            # Update shape costs
            this_Q1 = Q_array[:, :, edges[has_v, 0]]
            this_Q2 = Q_array[:, :, edges[has_v, 1]]

            F1 = np.einsum('ij,kji->ij', p[edges[has_v, 0]], this_Q1)[:, [0, 1]]
            F2 = np.einsum('ij,kji->ij', p[edges[has_v, 1]], this_Q2)[:, [0, 1]]

            F = np.append(F1, F2, axis=1)
            new_shape_cost = np.sum(F, axis=1)

            # Update sum of incoming edge lengths
            # Technically we would have to recalculate lengths of adjacent edges
            # every time but we will take the cheap way out and simply add them up
            verts_lengths[v] += verts_lengths[u]
            # Update sample costs for edges associated with v
            ik_edge = verts_lengths[edges[has_v, 0]]
            new_sample_cost = edge_lengths[has_v] * (ik_edge - edge_lengths[has_v])

            F_T[has_v] = new_shape_cost * shape_weight + new_sample_cost * sample_weight

    # After the edge collapse, the edges are garbled - I have yet to figure out
    # why and whether that can be prevented. However the vertices in those
    # edges are correct and so we just need to reconstruct their connectivity
    # by extracting a minimum spanning tree over the mesh.
    corrected_edges = mst_over_mesh(mesh, edges[keep].flatten())

    # Generate graph
    G = edges_to_graph(corrected_edges, vertices=mesh.vertices, fix_tree=True,
                       weight=False, drop_disconnected=False)

    swc, new_ids = make_swc(G, mesh, reindex=True)

    return Skeleton(swc=swc, mesh=mesh, mesh_map=None, method='edge_collapse')

Skeletonize a (contracted) mesh by iteratively collapsing edges.

This algorithm (described in [1]) iteratively collapses edges that are part of a face until no more faces are left. Edges are chosen based on a cost function that penalizes collapses that would change the shape of the object or would introduce long edges.

This is somewhat sensitive to the dimensions of the input mesh: too large and you might experience slow-downs or numpy OverflowErrors; too low and you might get skeletons that don't quite match the mesh (e.g. too few nodes). If you experience either, try down- or up-scaling your mesh, respectively.

Parameters
  • mesh (mesh obj): The mesh to be skeletonized. Can be an object that has .vertices and .faces properties (e.g. a trimesh.Trimesh) or a tuple (vertices, faces) or a dictionary {'vertices': vertices, 'faces': faces}.
  • shape_weight (float, optional): Weight for shape costs which penalize collapsing edges that would drastically change the shape of the object.
  • sample_weight (float, optional): Weight for sampling costs which penalize collapses that would generate prohibitively long edges.
  • progress (bool): If True, will show progress bar.
Returns
  • skeletor.Skeleton: Holds results of the skeletonization and enables quick visualization.
References

[1] Au OK, Tai CL, Chu HK, Cohen-Or D, Lee TY. Skeleton extraction by mesh contraction. ACM Transactions on Graphics (TOG). 2008 Aug 1;27(3):44.
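
A usage sketch (run on a contracted mesh, in keeping with the docstring; the weights shown are simply the documented defaults):

>>> import skeletor as sk
>>> mesh = sk.example_mesh()
>>> cont = sk.pre.contract(mesh, epsilon=0.1)
>>> skel = sk.skeletonize.by_edge_collapse(cont, shape_weight=1, sample_weight=0.1)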

def by_tangent_ball(mesh):
def by_tangent_ball(mesh):
    """Skeletonize a mesh by finding the maximal tangent ball.

    This algorithm casts a ray from every mesh vertex along its inverse normal
    (requires `ncollpyde`). It then creates a sphere that is tangent to the
    vertex and to where the ray hit the inside of a face on the opposite side.
    Next it drops spheres that overlap with another, larger sphere. Modified
    from [1].

    The method works best on smooth meshes and is rather sensitive to errors in
    the mesh such as incorrect normals (see `skeletor.pre.fix_mesh`), internal
    faces, noisy surface (try smoothing or downsampling) or holes in the mesh.

    Parameters
    ----------
    mesh :              mesh obj
                        The mesh to be skeletonized. Can be an object that has
                        ``.vertices`` and ``.faces`` properties (e.g. a
                        trimesh.Trimesh) or a tuple ``(vertices, faces)`` or a
                        dictionary ``{'vertices': vertices, 'faces': faces}``.

    Returns
    -------
    skeletor.Skeleton
                        Holds results of the skeletonization and enables quick
                        visualization.

    Examples
    --------
    >>> import skeletor as sk
    >>> mesh = sk.example_mesh()
    >>> fixed = sk.pre.fix_mesh(mesh, fix_normals=True, remove_disconnected=10)
    >>> skel = sk.skeletonize.by_tangent_ball(fixed)

    References
    ----------
    [1] Ma, J., Bae, S.W. & Choi, S. 3D medial axis point approximation using
        nearest neighbors and the normal field. Vis Comput 28, 7–19 (2012).
        https://doi.org/10.1007/s00371-011-0594-7

    """
    mesh = make_trimesh(mesh, validate=False)

    # Generate the KD tree
    tree = scipy.spatial.cKDTree(mesh.vertices)

    dist = tree.query(mesh.vertices, k=2)[0][:, 1]

    centers = np.zeros(mesh.vertices.shape)
    radii = np.zeros(mesh.vertices.shape[0])

    coll = ncollpyde.Volume(mesh.vertices, mesh.faces, validate=False)
    sources = mesh.vertices - mesh.vertex_normals * 0.01
    targets = mesh.vertices - mesh.vertex_normals * (dist.max() * 10)
    ix, loc, is_backface = coll.intersections(sources, targets)

    # Now we need to invalidate centers
    intersects = np.zeros(mesh.vertices.shape[0]).astype(bool)
    intersects[ix[is_backface]] = True
    centers[ix] = mesh.vertices[ix] + (loc - mesh.vertices[ix]) / 2
    radii[ix] = np.linalg.norm(loc - mesh.vertices[ix], axis=1) / 2

    # Now for some post-processing
    inv = np.zeros(mesh.vertices.shape[0]).astype(bool)

    # Invalidate vertices that didn't intersect
    inv[~intersects] = True

    # Now invalidate any ball that is outside the mesh
    inv[~coll.contains(centers)] = True

    # Find tangent balls that are fully contained in another tangent ball
    # (those are not maximal inscribed)
    original_ind = np.arange(mesh.vertices.shape[0])
    while True:
        tree2 = scipy.spatial.cKDTree(centers[~inv])
        # For any not-yet-invalidated center find the closest other center
        dist, ix = tree2.query(centers[~inv], k=2)

        # Drop self-hits
        ix, dist = ix[:, 1], dist[:, 1]

        # In radius
        in_radius = dist < radii[~inv]

        # Stop if no more overlapping pairs
        if not in_radius.any():
            break

        # Collect radii to determine which of the overlapping balls survives
        pair_rad = np.vstack((radii[~inv][in_radius],
                              radii[~inv][ix[in_radius]])).T
        pair_ix = np.vstack((original_ind[~inv][in_radius],
                             original_ind[~inv][ix[in_radius]])).T

        # Invalidate the losers
        looses = np.argmax(pair_rad, axis=1)
        looser_ix = np.unique(pair_ix[np.arange(pair_ix.shape[0]), looses])
        inv[looser_ix] = True

    # Now we need to collapse nodes into the remaining centers
    G = ig.Graph(n=mesh.vertices.shape[0],
                 edges=mesh.edges_unique,
                 directed=False)

    # Make sure that every connected component has at least one valid target
    for cc in G.clusters():
        if not np.isin(cc, original_ind[~inv]).any():
            inv[cc[0]] = False
            centers[cc[0]] = mesh.vertices[cc[0]]

    # For each invalidated vertex, find the closest vertex that is still valid
    # This works on unweighted edges but should be good enough - way faster
    # than a proper path search for sure
    pairs = find_closest(G, sources=original_ind[inv],
                         targets=original_ind[~inv])

    # Generate a mesh vertex to skeleton node map
    mesh_map = original_ind.copy()
    mesh_map[pairs[:, 0]] = pairs[:, 1]

    # Renumber the vertices from 0 -> N_vertices
    uni, ind, mesh_map = np.unique(mesh_map, return_inverse=True, return_index=True)

    # Make sure centers and radii match the new order
    centers = centers[uni]
    radii = radii[uni]

    # Contract vertices to nodes according to the mesh
    G.contract_vertices(mesh_map, combine_attrs=None)

    # This only drops duplicate and self-loop edges
    G = G.simplify()

    # Generate weights between remaining centers
    el = np.array(G.get_edgelist())
    weights = np.linalg.norm(centers[el[:, 0]] - centers[el[:, 1]], axis=1)

    # Generate hierarchical tree
    tree = G.spanning_tree(weights=weights)

    # Create a directed acyclic and hierarchical graph
    G_nx = edges_to_graph(edges=np.array(tree.get_edgelist()),
                          nodes=np.arange(0, len(G.vs)),
                          fix_tree=True,
                          drop_disconnected=False)

    # Generate the SWC table
    swc = make_swc(G_nx, coords=centers, reindex=False)
    swc['radius'] = radii[swc.node_id.values]
    _, new_ids = reindex_swc(swc, inplace=True)

    # Update vertex to node ID map
    mesh_map = np.array([new_ids[n] for n in mesh_map])

    return Skeleton(swc=swc, mesh=mesh, mesh_map=mesh_map,
                    method='tangent_ball')

Skeletonize a mesh by finding the maximal tangent ball.

This algorithm casts a ray from every mesh vertex along its inverse normal (requires ncollpyde). It then creates a sphere that is tangent to the vertex and to where the ray hit the inside of a face on the opposite side. Next it drops spheres that overlap with another, larger sphere. Modified from [1].

The method works best on smooth meshes and is rather sensitive to errors in the mesh such as incorrect normals (see skeletor.pre.fix_mesh), internal faces, noisy surface (try smoothing or downsampling) or holes in the mesh.

Parameters
  • mesh (mesh obj): The mesh to be skeletonized. Can be an object that has .vertices and .faces properties (e.g. a trimesh.Trimesh) or a tuple (vertices, faces) or a dictionary {'vertices': vertices, 'faces': faces}.
Returns
  • skeletor.Skeleton: Holds results of the skeletonization and enables quick visualization.
Examples
>>> import skeletor as sk
>>> mesh = sk.example_mesh()
>>> fixed = sk.pre.fix_mesh(mesh, fix_normals=True, remove_disconnected=10)
>>> skel = sk.skeletonize.by_tangent_ball(fixed)
References

[1] Ma, J., Bae, S.W. & Choi, S. 3D medial axis point approximation using nearest neighbors and the normal field. Vis Comput 28, 7–19 (2012). https://doi.org/10.1007/s00371-011-0594-7