Implicit elastic matching with random projections for pose-variant face recognition

Bibliographic Details
Published in: 2009 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1502 - 1509
Main Authors: Wright, John; Hua, Gang
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2009

Summary: We present a new approach to robust pose-variant face recognition, which exhibits excellent generalization ability even across completely different datasets due to its weak dependence on data. Most face recognition algorithms assume that the face images are very well-aligned. This assumption is often violated in real-life face recognition tasks, in which face detection and rectification have to be performed automatically prior to recognition. Although great improvements have been made in face alignment recently, significant pose variations may still occur in the aligned faces. We propose a multiscale local descriptor-based face representation to mitigate this issue. First, discriminative local image descriptors are extracted from a dense set of multiscale image patches. The descriptors are expanded by their spatial locations. Each expanded descriptor is quantized by a set of random projection trees. The final face representation is a histogram of the quantized descriptors. The location expansion constrains the quantization regions to be localized not just in feature space but also in image space, allowing us to achieve an implicit elastic matching for face images. Our experiments on challenging face recognition benchmarks demonstrate the advantages of the proposed approach for handling large pose variations, as well as its superb generalization ability.
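
The following Python sketch illustrates the pipeline described in the summary: location-expanded local descriptors are quantized by random projection trees and pooled into a histogram. It is a minimal illustration only; the class and function names (RPTree, face_histogram), the median-split rule, the location-scaling parameter alpha, and the toy descriptors are assumptions made here, not the authors' implementation.

```python
import numpy as np

class RPTree:
    """Random projection tree: each internal node projects its points onto a
    random direction and splits at the median; leaves act as quantization cells."""

    def __init__(self, max_depth, rng):
        self.max_depth = max_depth
        self.rng = rng
        self.n_leaves = 0
        self.root = None

    def fit(self, X):
        self.root = self._build(np.asarray(X, dtype=float), depth=0)
        return self

    def _build(self, X, depth):
        if depth == self.max_depth or len(X) <= 1:
            leaf = {"leaf_id": self.n_leaves}
            self.n_leaves += 1
            return leaf
        direction = self.rng.standard_normal(X.shape[1])
        direction /= np.linalg.norm(direction)
        proj = X @ direction
        threshold = np.median(proj)
        left = proj <= threshold
        return {
            "direction": direction,
            "threshold": threshold,
            "left": self._build(X[left], depth + 1),
            "right": self._build(X[~left], depth + 1),
        }

    def quantize(self, x):
        """Route one expanded descriptor to a leaf and return its index."""
        node = self.root
        while "leaf_id" not in node:
            node = node["left"] if x @ node["direction"] <= node["threshold"] else node["right"]
        return node["leaf_id"]


def face_histogram(descriptors, locations, trees, alpha=1.0):
    """Histogram of quantized, location-expanded descriptors over several RP trees."""
    # Location expansion: appending (scaled) patch coordinates makes each
    # quantization cell local in image space as well as feature space,
    # which is what gives the implicit elastic matching effect.
    expanded = np.hstack([descriptors, alpha * locations])
    hist = np.zeros(sum(t.n_leaves for t in trees))
    offset = 0
    for tree in trees:
        for x in expanded:
            hist[offset + tree.quantize(x)] += 1
        offset += tree.n_leaves
    return hist / max(hist.sum(), 1.0)


# Toy usage with random stand-ins for dense multiscale descriptors and patch centers.
rng = np.random.default_rng(0)
desc = rng.standard_normal((500, 128))        # e.g. 128-D local descriptors
locs = rng.uniform(0.0, 1.0, size=(500, 2))   # normalized (x, y) patch locations
expanded = np.hstack([desc, locs])
trees = [RPTree(max_depth=6, rng=rng).fit(expanded) for _ in range(4)]
h = face_histogram(desc, locs, trees)         # fixed-length face representation
```

Comparing two such histograms reduces face matching to comparing fixed-length vectors; the descriptors, tree construction, and similarity measure actually used in the paper are given in the full text.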
ISBN: 1424439922, 9781424439928
ISSN: 1063-6919
DOI: 10.1109/CVPR.2009.5206786