Isometric Shape Representation by Integrating Shape Function Maps and Deep Learning
| Published in | IEEE Access, Vol. 7, pp. 158503-158513 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2019 |
Summary: With the wide application of isometric 3D models, their representation and recognition have received increasing attention. Most existing methods for shape representation and analysis either rely on constrained hand-crafted models or devote effort purely to developing complicated deep learning methods. To exploit the advantages of both, this paper presents a novel mixture modeling approach that integrates the shape function map (SFM) with a deep convolutional neural network (CNN). First, multiple SFM maps are constructed to capture the rigid and non-rigid information that is usually considered separately in shape representation. Then, to fully characterize the low-level information in the SFM, diverse sets of deep features are learned on different SFM maps while training the classifiers. Finally, the rigid and non-rigid deep features are integrated for more discriminative feature abstraction. Experimental results on standard shape benchmark datasets validate the superior performance of the proposed approach in feature extraction, classification, and retrieval. In addition, evaluations on further shape datasets (e.g., noisy, CAD, and protein shapes) again confirm the effectiveness of the proposed algorithm.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2950279
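The fusion idea described in the summary (features extracted independently from several shape-function maps, then concatenated into one descriptor before classification) can be sketched as follows. This is purely an illustration of the general pattern, not the paper's actual method: the function names, the histogram stand-in for a CNN branch, and all shapes and parameters are hypothetical.

```python
# Illustrative sketch only: each "branch" here is a simple normalized
# histogram standing in for a learned CNN feature extractor; the real
# system in the paper trains deep features per SFM map.

def extract_features(sfm_map, n_bins=8):
    """Stand-in for one per-map branch: a normalized value histogram."""
    hist = [0] * n_bins
    for v in sfm_map:
        # Clamp values at the top edge into the last bin.
        idx = min(int(v * n_bins), n_bins - 1)
        hist[idx] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def fuse(maps):
    """Concatenate per-map features (e.g. rigid + non-rigid) into one
    joint descriptor, mirroring the integration step in the summary."""
    descriptor = []
    for m in maps:
        descriptor.extend(extract_features(m))
    return descriptor

# Example: two placeholder maps (flattened to value lists) in [0, 1).
rigid_map = [0.1, 0.2, 0.15, 0.3]       # hypothetical rigid SFM values
nonrigid_map = [0.8, 0.9, 0.85, 0.95]   # hypothetical non-rigid values
descriptor = fuse([rigid_map, nonrigid_map])
print(len(descriptor))  # 16: 8 bins per map, two maps
```

A classifier would then be trained on the fused descriptor; the point of the pattern is that each map contributes its own feature set before integration, rather than collapsing all maps into one representation up front.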