Precise segmentation of densely interweaving neuron clusters using G-Cut
Published in: Nature Communications, Vol. 10, No. 1, Article 1549 (12 pages)
Format: Journal Article
Language: English
Published: London, Nature Publishing Group UK, 4 April 2019
Summary: Characterizing the precise three-dimensional morphology and anatomical context of neurons is crucial for neuronal cell type classification and circuitry mapping. Recent advances in tissue clearing techniques and microscopy make it possible to obtain image stacks of intact, interweaving neuron clusters in brain tissues. Because most current 3D neuronal morphology reconstruction methods are applicable only to single neurons, digitally reconstructing these clusters remains challenging. To address these challenges, we propose G-Cut, a fast and robust method that automatically segments individual neurons from an interweaving neuron cluster. Across various densely interconnected neuron clusters, G-Cut achieves significantly higher accuracies than other state-of-the-art algorithms. G-Cut is intended as a robust component in a high-throughput informatics pipeline for large-scale brain mapping projects.
Most neuronal reconstruction software can automatically trace single neuronal morphologies, but tracing multiple, densely interwoven neurons is much more challenging. Here the authors develop G-Cut, a computational approach for accurate segmentation of densely interconnected neuron clusters.
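As a rough illustration of the segmentation task described above, the sketch below assigns every node of a merged multi-soma reconstruction graph to its geodesically nearest soma. This is only a naive baseline under assumed data structures (a NetworkX graph with a "length" edge attribute and a known list of soma nodes); it is not the G-Cut algorithm presented in the paper.

```python
# Illustrative sketch only: a naive baseline for splitting a merged neuron-cluster
# reconstruction into per-soma groups by nearest-soma geodesic distance.
# This is NOT the G-Cut method from the paper; the data layout is hypothetical.
import networkx as nx

def split_cluster_by_nearest_soma(graph: nx.Graph, soma_nodes):
    """Assign every node in a merged reconstruction graph to its geodesically
    nearest soma. The edge attribute 'length' holds physical segment length."""
    assignment = {}
    best_dist = {}
    for soma in soma_nodes:
        # Shortest path lengths (in physical units) from this soma to all nodes.
        dist = nx.single_source_dijkstra_path_length(graph, soma, weight="length")
        for node, d in dist.items():
            if node not in best_dist or d < best_dist[node]:
                best_dist[node] = d
                assignment[node] = soma
    return assignment

if __name__ == "__main__":
    # Toy cluster: two somata (nodes 0 and 5) joined by a shared neurite path.
    g = nx.Graph()
    g.add_weighted_edges_from(
        [(0, 1, 4.0), (1, 2, 3.0), (2, 3, 3.0), (3, 4, 3.0), (4, 5, 4.0)],
        weight="length",
    )
    print(split_cluster_by_nearest_soma(g, soma_nodes=[0, 5]))
```

A real pipeline would instead operate on reconstructed morphologies (for example SWC files) and, as the summary indicates, the published method uses additional morphological information rather than distance alone to decide neurite ownership.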
ISSN: 2041-1723
DOI: 10.1038/s41467-019-09515-0