Efficient similarity search for covariance matrices via the Jensen-Bregman LogDet Divergence
Published in | 2011 International Conference on Computer Vision, pp. 2399-2406 |
---|---|
Main Authors | , , , |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 01.11.2011 |
Subjects | |
Summary: | Covariance matrices provide compact, informative feature descriptors for several computer vision applications, such as people-appearance tracking, diffusion-tensor imaging, and activity recognition. A key task in many of these applications is to compare different covariance matrices using a (dis)similarity function. A natural choice here is the Riemannian metric corresponding to the manifold inhabited by covariance matrices. However, computations involving this metric are expensive, especially for large matrices, and even more so in gradient-based algorithms. To alleviate these difficulties, we advocate a novel dissimilarity measure for covariance matrices: the Jensen-Bregman LogDet Divergence. This divergence enjoys several useful theoretical properties, but its greatest benefits are: (i) lower computational cost than standard approaches; and (ii) amenability to nearest-neighbor retrieval. We present numerous experiments to substantiate these claims. |
---|---|
ISBN: | 9781457711015; 145771101X |
ISSN: | 1550-5499; 2380-7504 |
DOI: | 10.1109/ICCV.2011.6126523 |
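
The Jensen-Bregman LogDet Divergence named in the summary has a simple closed form, J(X, Y) = log det((X + Y)/2) - (1/2) log det(XY), defined for symmetric positive-definite matrices X and Y. As a rough illustration of the cost comparison the summary alludes to, here is a minimal sketch (assuming NumPy/SciPy; the helper names and the random test matrices are illustrative, not taken from the paper) that computes the JBLD alongside the affine-invariant Riemannian metric:

```python
import numpy as np
from scipy.linalg import cholesky, eigh


def random_spd(d, rng):
    """Generate a random symmetric positive-definite test matrix."""
    A = rng.standard_normal((d, d))
    return A @ A.T + d * np.eye(d)


def logdet(X):
    """Numerically stable log-determinant via Cholesky factorization."""
    L = cholesky(X, lower=True)
    return 2.0 * np.sum(np.log(np.diag(L)))


def jbld(X, Y):
    """Jensen-Bregman LogDet divergence:
    J(X, Y) = log det((X + Y)/2) - (1/2) log det(X Y)."""
    return logdet(0.5 * (X + Y)) - 0.5 * (logdet(X) + logdet(Y))


def airm(X, Y):
    """Affine-invariant Riemannian metric:
    d(X, Y) = ||log(X^{-1/2} Y X^{-1/2})||_F, computed from the
    generalized eigenvalues of the pencil (Y, X)."""
    w = eigh(Y, X, eigvals_only=True)
    return np.sqrt(np.sum(np.log(w) ** 2))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, Y = random_spd(32, rng), random_spd(32, rng)
    print("JBLD:", jbld(X, Y))
    print("AIRM:", airm(X, Y))
```

In this sketch the JBLD needs only log-determinants (one Cholesky factorization per matrix, and log det X, log det Y can be precomputed per database entry in a retrieval setting), while the Riemannian distance requires a generalized eigendecomposition for every pair, which is consistent with the computational advantage the summary claims.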