Super-Sparse Regression for Fast Age Estimation from Faces at Test Time


Bibliographic Details
Published in Image Analysis and Processing -- ICIAP 2015, Vol. 9280, pp. 551-562
Main Authors Demontis, Ambra, Biggio, Battista, Fumera, Giorgio, Roli, Fabio
Format Book Chapter
Language English
Published Switzerland, Springer International Publishing AG, 2015
Series Lecture Notes in Computer Science

More Information
Summary: Age estimation from faces is a challenging problem that has recently gained increasing relevance due to its potentially multi-faceted applications. Many current methods for age estimation rely on extracting computationally demanding features from face images and then use nonlinear regression to estimate the subject's age. This often requires matching the submitted face image against a set of face prototypes, potentially including all training face images, as in the case of kernel-based methods. In this work, we propose a super-sparse regression technique that achieves performance comparable to that of other nonlinear regression techniques while drastically reducing the number of reference prototypes required for age estimation. Given a similarity measure between faces, our technique learns a sparse set of virtual face prototypes, whose number is fixed a priori, along with a set of optimal weight coefficients to perform linear regression in the space induced by the similarity measure. We show that our technique not only drastically reduces the number of reference prototypes without compromising estimation accuracy, but can also provide more interpretable decisions.
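The summary's core idea — jointly learning a small, fixed-size set of virtual prototypes together with linear-regression weights in a similarity-induced space — can be illustrated with a minimal sketch. This is not the authors' algorithm: the RBF similarity, the plain gradient-descent optimization, and all function names (`rbf_similarity`, `fit_super_sparse`, `predict`) are assumptions made for illustration only.

```python
import numpy as np

def rbf_similarity(X, P, gamma=1.0):
    """RBF similarity between samples X (n, d) and prototypes P (m, d)."""
    d2 = ((X[:, None, :] - P[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_super_sparse(X, y, m=4, gamma=1.0, lr=0.05, iters=2000, seed=0):
    """Jointly learn m virtual prototypes P and weights w so that
    rbf_similarity(X, P) @ w approximates y, via plain gradient descent
    on the mean squared error (an illustrative stand-in for the paper's
    actual optimization)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize virtual prototypes from random training samples.
    P = X[rng.choice(n, size=m, replace=False)].astype(float)
    w = np.zeros(m)
    for _ in range(iters):
        S = rbf_similarity(X, P, gamma)            # (n, m)
        r = S @ w - y                              # residuals
        grad_w = S.T @ r / n
        # dS[i, j] / dP[j] = 2 * gamma * S[i, j] * (X[i] - P[j])
        G = r[:, None] * S * w[None, :]            # (n, m)
        grad_P = 2 * gamma * (G.T @ X - G.sum(axis=0)[:, None] * P) / n
        w -= lr * grad_w
        P -= lr * grad_P
    return P, w

def predict(Xnew, P, w, gamma=1.0):
    """Linear regression in the similarity-induced space: only m
    similarity evaluations per query, instead of one per training face
    as in standard kernel-based regression."""
    return rbf_similarity(Xnew, P, gamma) @ w
```

The test-time speed-up described in the abstract comes from the last function: prediction cost scales with the fixed number of virtual prototypes `m`, not with the training-set size.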
ISBN:3319232339
9783319232331
ISSN:0302-9743
1611-3349
DOI:10.1007/978-3-319-23234-8_51