Efficient similarity search for covariance matrices via the Jensen-Bregman LogDet Divergence

Bibliographic Details
Published in: 2011 International Conference on Computer Vision, pp. 2399-2406
Main Authors: Cherian, A., Sra, S., Banerjee, A., Papanikolopoulos, N.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.11.2011

More Information
Summary: Covariance matrices provide compact, informative feature descriptors for use in several computer vision applications, such as people-appearance tracking, diffusion-tensor imaging, and activity recognition. A key task in many of these applications is to compare different covariance matrices using a (dis)similarity function. A natural choice here is the Riemannian metric corresponding to the manifold inhabited by covariance matrices. But computations involving this metric are expensive, especially for large matrices, and even more so in gradient-based algorithms. To alleviate these difficulties, we advocate a novel dissimilarity measure for covariance matrices: the Jensen-Bregman LogDet Divergence. This divergence enjoys several useful theoretical properties, but its greatest benefits are: (i) lower computational costs (compared to standard approaches); and (ii) amenability for use in nearest-neighbor retrieval. We present numerous experiments to substantiate these claims.
ISBN: 9781457711015, 145771101X
ISSN: 1550-5499, 2380-7504
DOI: 10.1109/ICCV.2011.6126523
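
As a rough illustration of the divergence described in the summary above (a minimal sketch, not code from the paper), the Jensen-Bregman LogDet Divergence between two symmetric positive-definite matrices X and Y is commonly written as J(X, Y) = log det((X + Y)/2) - (1/2) log det(XY). It can be evaluated with Cholesky factorizations alone, with no eigendecompositions or matrix logarithms, which is where its cost advantage over the affine-invariant Riemannian metric comes from. The random matrices below are hypothetical stand-ins for real covariance descriptors.

import numpy as np

def logdet(A):
    # Log-determinant of a symmetric positive-definite matrix via Cholesky,
    # which is numerically stable and avoids forming det(A) directly.
    L = np.linalg.cholesky(A)
    return 2.0 * np.sum(np.log(np.diag(L)))

def jbld(X, Y):
    # Jensen-Bregman LogDet Divergence:
    #   J(X, Y) = log det((X + Y) / 2) - (1/2) * log det(X Y)
    # using log det(XY) = log det(X) + log det(Y).
    return logdet(0.5 * (X + Y)) - 0.5 * (logdet(X) + logdet(Y))

if __name__ == "__main__":
    # Two random SPD matrices standing in for covariance descriptors.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((8, 8))
    B = rng.standard_normal((8, 8))
    X = A @ A.T + 1e-3 * np.eye(8)
    Y = B @ B.T + 1e-3 * np.eye(8)
    print("JBLD(X, Y) =", jbld(X, Y))
    print("JBLD(X, X) =", jbld(X, X))  # zero (up to rounding) for identical inputs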