On spatio-temporal feature point detection for animated meshes

Bibliographic Details
Published in: The Visual Computer, Vol. 31, No. 11, pp. 1471-1486
Main Authors: Mykhalchuk, Vasyl; Seo, Hyewon; Cordier, Frederic
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.11.2015
Summary: Although automatic feature detection has long been pursued by researchers in computer graphics and computer vision, feature extraction on deforming models remains a relatively unexplored area. In this paper, we develop a new method for the automatic detection of spatio-temporal feature points on animated meshes. Our algorithm consists of three main parts. First, we define local deformation characteristics, based on strain and curvature values computed for each point at each frame. Next, we construct multi-resolution space–time Gaussian and difference-of-Gaussian (DoG) pyramids on the deformation characteristics representing the input animated mesh, where each level contains a 3D-smoothed and subsampled representation of the previous level. Finally, we estimate the locations and scales of spatio-temporal feature points by using a scale-normalized differential operator. A new, precise approximation of the spatio-temporal scale-normalized Laplacian is introduced, based on the space–time DoG. We have experimentally verified our algorithm on a number of examples and conclude that our technique detects spatio-temporal feature points in a reliable manner.
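
The three-part pipeline in the summary (a per-vertex deformation signal, a space–time Gaussian/DoG pyramid, and scale-normalized extremum detection) follows the classic scale-space recipe, in which the difference-of-Gaussians approximates the scale-normalized Laplacian: G(x, k*sigma) - G(x, sigma) ≈ (k - 1) * sigma^2 * ∇²G (Lindeberg; Lowe). The paper's stated contribution is a more precise space–time analogue of this approximation. Below is a minimal Python sketch of that structure, assuming the strain/curvature characteristic has already been computed into a (vertices × frames) array; every name is hypothetical, spatial smoothing and subsampling are crudely approximated along a flat vertex axis rather than over the mesh neighborhood graph, and this is not the authors' implementation.

# Minimal sketch of a space-time DoG feature detector over a per-vertex,
# per-frame deformation signal. Illustrative only: real spatial smoothing
# would diffuse over the mesh's 1-ring neighborhoods, not a flat vertex axis.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def build_dog_pyramid(signal, num_levels=4, sigma_space=1.0, sigma_time=1.0):
    """signal: (V, T) array of deformation characteristics.
    Returns one DoG layer per pyramid level."""
    dogs = []
    current = signal
    for _ in range(num_levels):
        smoothed = gaussian_filter1d(current, sigma_space, axis=0)  # space
        smoothed = gaussian_filter1d(smoothed, sigma_time, axis=1)  # time
        dogs.append(smoothed - current)  # DoG ~ scale-normalized Laplacian
        current = smoothed[::2, ::2]     # subsample for the next level
    return dogs

def detect_extrema(dog, threshold=0.1):
    """Return (vertex, frame) indices where |DoG| is a local space-time
    maximum above threshold: candidate spatio-temporal feature points."""
    candidates = []
    V, T = dog.shape
    for v in range(1, V - 1):
        for t in range(1, T - 1):
            val = dog[v, t]
            patch = np.abs(dog[v - 1:v + 2, t - 1:t + 2])
            if abs(val) > threshold and abs(val) >= patch.max():
                candidates.append((v, t))
    return candidates

# Usage: one candidate list per scale level.
# dogs = build_dog_pyramid(deformation_signal)
# per_level = [detect_extrema(d) for d in dogs]

In a full detector, each candidate would also be compared against its neighbors in the adjacent DoG levels, and the level with the strongest response would supply the feature point's spatio-temporal scale estimate.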
ISSN: 0178-2789; 1432-2315; 1432-8726
DOI: 10.1007/s00371-014-1027-1