A Robust 3D Mesh Segmentation Algorithm With Anisotropic Sparse Embedding

Bibliographic Details
Published in: Computer Animation and Virtual Worlds, Vol. 36, No. 3
Main Authors: Zhang, Mengyao; Li, Wenting; Zhao, Yong; Si, Xin; Zhang, Jingliang
Format: Journal Article
Language: English
Published: Hoboken, USA: John Wiley & Sons, Inc. (Wiley Subscription Services, Inc.), 01.05.2025
Summary: ABSTRACT 3D mesh segmentation, a very challenging problem in computer graphics, has attracted considerable interest. The most popular methods in recent years are data-driven, but such methods require a large amount of accurately labeled data, which is difficult to obtain. In this article, we propose a novel mesh segmentation algorithm based on anisotropic sparse embedding. We first over-segment the input mesh to obtain a collection of patches. These patches are then embedded into a latent space by solving an anisotropic $L_1$-regularized optimization problem. In the new space, patches that belong to the same part of the mesh lie closer together, while those belonging to different parts lie farther apart. Finally, the segmentation result is generated by clustering. Experimental results on the PSB and COSEG datasets show that our algorithm produces perception-aware results and is superior to state-of-the-art algorithms. In addition, the proposed algorithm robustly handles meshes with different poses, different triangulations, noise, missing regions, or missing parts. A robust algorithm is introduced to segment complex 3D meshes, and extensive experimental results demonstrate that it obtains accurate segmentation boundaries.
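
The abstract names the pipeline (over-segmentation into patches, anisotropic $L_1$-regularized embedding, clustering) without giving its formulation. The sketch below is one plausible reading under stated assumptions, not the authors' method: it treats the embedding step as a sparse self-representation of per-patch feature vectors with a per-coefficient (anisotropic) $L_1$ weight, solved by ISTA, followed by spectral clustering on the induced patch affinity. The feature matrix X, the anisotropy weights w, lambda, and all helper names are illustrative stand-ins.

    # Minimal sketch, assuming patch features X and anisotropy weights w are
    # already computed; none of this comes from the paper itself.
    import numpy as np
    from sklearn.cluster import SpectralClustering

    def anisotropic_sparse_codes(X, w, lam=0.1, n_iter=200):
        """Sparse self-representation with an anisotropic (weighted) L1 penalty,
        solved by ISTA:
            min_C  0.5 * ||X - X C||_F^2 + lam * sum_ij w_i |C_ij|,  diag(C) = 0.
        X: (d, n) patch-feature matrix; w: (n,) positive per-patch weights."""
        n = X.shape[1]
        C = np.zeros((n, n))
        L = np.linalg.norm(X.T @ X, 2)           # Lipschitz constant of the gradient
        thresh = (lam / L) * w[:, None]          # per-row soft-threshold levels
        for _ in range(n_iter):
            G = X.T @ (X @ C - X)                # gradient of the quadratic term
            C = C - G / L                        # gradient step
            C = np.sign(C) * np.maximum(np.abs(C) - thresh, 0.0)  # weighted-L1 prox
            np.fill_diagonal(C, 0.0)             # forbid trivial self-representation
        return C

    def segment(X, w, n_parts, lam=0.1):
        C = anisotropic_sparse_codes(X, w, lam)
        A = np.abs(C) + np.abs(C).T              # symmetric patch affinity
        return SpectralClustering(n_clusters=n_parts,
                                  affinity="precomputed").fit_predict(A)

    # Example with synthetic stand-ins for real patch features:
    # X = np.random.rand(64, 200)   # 200 patches, 64-D features (hypothetical)
    # w = np.ones(200)              # isotropic default; the paper's weights differ
    # labels = segment(X, w, n_parts=5)

In this reading, the sparse codes play the role of the latent embedding: patches from the same part tend to represent one another and thus receive high mutual affinity, which is what the subsequent clustering exploits.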
ISSN: 1546-4261, 1546-427X
DOI: 10.1002/cav.70042