Fast reconstruction of water-tight surface mesh of neurons

Bibliographic Details
Published in: Journal of Visualization, Vol. 27, No. 3, pp. 437–450
Main Authors: Wang, Yinzhao; Li, Yuan; Tao, Yubo; Lin, Hai; Wang, Jiarun
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.06.2024
ISSN: 1343-8875; 1875-8975
DOI: 10.1007/s12650-024-00970-6

Summary: Neuron morphology reconstruction from high-resolution imaging data is essential for understanding the structure and function of the brain in neuroscience. However, previous methods cannot achieve both water-tightness and high performance in surface mesh reconstruction of large-scale neurons. This paper therefore proposes a novel neuronal surface mesh reconstruction algorithm based on isosurface extraction, virtual memory management, and parallel computation. The space of a neuron is first divided into blocks, which are organized as a sparse octree to handle large-scale neurons with long projections. We then perform voxelization and isosurface extraction on valid blocks based on the skeleton model of the neuron, ensuring that the generated mesh is water-tight and that its quality and density are controllable. Since each block is processed independently, the reconstruction can run in parallel for high performance, and can be rerun partially for interactive modification during neuron proofreading. Experiments demonstrate that the proposed algorithm generates water-tight neuronal surface meshes effectively and satisfies the needs of interactive visualization and correction.
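
The per-block pipeline described in the summary can be made concrete with a small sketch. The following Python fragment is an illustration only, not the authors' implementation: it treats the skeleton as a union of spheres (an SWC-style node-plus-radius representation), voxelizes each valid block into a signed distance field, and extracts the zero isosurface per block with scikit-image's marching cubes, meshing blocks in parallel. The block size, the one-voxel apron, and all names (BLOCK, mesh_block, reconstruct) are assumptions made for the example.

```python
# Illustrative sketch (not the paper's code): per-block voxelization and
# isosurface extraction of a sphere-swept neuron skeleton, run in parallel.
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from skimage import measure  # marching cubes for isosurface extraction

BLOCK = 32  # voxels per block edge (hypothetical choice)

def mesh_block(args):
    """Voxelize the skeleton spheres overlapping one block, then extract
    the zero-level isosurface of the resulting signed distance field.

    args = (origin, spheres): the block origin in global voxel coordinates,
    and the (center, radius) skeleton nodes overlapping the expanded block.
    """
    origin, spheres = args
    n = BLOCK + 2  # one-voxel apron: neighboring blocks sample the same
                   # boundary voxels, so their meshes meet without cracks
    idx = np.indices((n, n, n)).transpose(1, 2, 3, 0)  # (i, j, k) per voxel
    pts = idx + (np.asarray(origin) - 1.0)             # global coordinates
    field = np.full((n, n, n), np.inf)
    for center, radius in spheres:  # skeleton nodes swept as spheres
        d = np.linalg.norm(pts - np.asarray(center), axis=-1) - radius
        field = np.minimum(field, d)  # union of spheres = min of SDFs
    if field.min() >= 0.0 or field.max() <= 0.0:
        return None  # block entirely outside (or inside) the neuron
    verts, faces, _, _ = measure.marching_cubes(field, level=0.0)
    return verts + (np.asarray(origin) - 1.0), faces  # back to global coords

def reconstruct(valid_blocks):
    """Blocks are independent, so they can be meshed in parallel; the same
    routine can re-mesh a single edited block during proofreading."""
    with ProcessPoolExecutor() as pool:
        return [m for m in pool.map(mesh_block, valid_blocks) if m is not None]
```

Because every block is meshed in isolation from the deterministic skeleton field, rerunning mesh_block on only the blocks touched by an edit is enough to update the surface, which is what makes the partial, interactive re-reconstruction mentioned above cheap.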