PML: Progressive Margin Loss for Long-tailed Age Classification


Bibliographic Details
Published in: 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 10498-10507
Main Authors: Deng, Zongyong; Liu, Hao; Wang, Yaoxing; Wang, Chenyang; Yu, Zekuan; Sun, Xuehong
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2021

Summary: In this paper, we propose a progressive margin loss (PML) approach for unconstrained facial age classification. Conventional methods make the strong assumption that each class owns adequate instances to outline its data distribution, which likely leads to biased predictions when training samples are sparse across age classes. Instead, our PML aims to adaptively refine the age label pattern by enforcing a couple of margins, which fully accounts for the discrepancy among the intra-class variance, the inter-class variance and the class centers. Our PML incorporates an ordinal margin and a variational margin, which are simultaneously plugged into the globally-tuned deep neural network paradigm. More specifically, the ordinal margin learns to exploit the correlated relationship of the real-world age labels. Accordingly, the variational margin is leveraged to minimize the influence of head classes that misleads the prediction of tail samples. Moreover, our optimization carefully seeks a series of indicator curricula to achieve robust and efficient model training. Extensive experimental results on three face aging datasets demonstrate that our PML achieves compelling performance compared to the state of the art. Code will be made publicly available.
ISSN: 2575-7075
DOI: 10.1109/CVPR46437.2021.01036
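As a concrete starting point for the margin idea described in the summary, below is a minimal PyTorch sketch of a class-dependent additive-margin softmax: classes with few training samples receive larger margins to counteract head-class bias, loosely in the spirit of the variational margin. The class name MarginSoftmaxHead, the LDAM-style margin schedule (margin proportional to n_c^-1/4) and the scale/max_margin parameters are illustrative assumptions, not the authors' released implementation, which additionally learns an ordinal margin and a progressive training curriculum.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MarginSoftmaxHead(nn.Module):
    """Cosine classifier with class-dependent additive margins (illustrative sketch).

    Tail classes (few samples) get larger margins, so the decision boundary is
    pushed away from sparse age classes instead of being dominated by head classes.
    """

    def __init__(self, feat_dim, num_classes, class_counts, scale=30.0, max_margin=0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, feat_dim))
        nn.init.xavier_uniform_(self.weight)
        # Assumed LDAM-style schedule: margin_c proportional to n_c^(-1/4),
        # rescaled so the rarest class gets max_margin.
        counts = torch.as_tensor(class_counts, dtype=torch.float32)
        margins = 1.0 / counts.pow(0.25)
        margins = margins * (max_margin / margins.max())
        self.register_buffer("margins", margins)
        self.scale = scale

    def forward(self, features, labels):
        # Cosine similarity between L2-normalized features and class weights.
        logits = F.linear(F.normalize(features), F.normalize(self.weight))
        # Subtract each sample's ground-truth class margin from its target logit.
        target_margin = self.margins[labels]
        logits = logits.scatter_add(1, labels.unsqueeze(1), -target_margin.unsqueeze(1))
        return F.cross_entropy(self.scale * logits, labels)

# Toy usage: 8 age classes with a long-tailed count distribution.
if __name__ == "__main__":
    head = MarginSoftmaxHead(feat_dim=64, num_classes=8,
                             class_counts=[500, 300, 200, 100, 50, 20, 10, 5])
    feats = torch.randn(16, 64)
    labels = torch.randint(0, 8, (16,))
    loss = head(feats, labels)
    loss.backward()
    print(loss.item())

In the paper itself, the margins are not fixed by a count-based schedule: the ordinal and variational margins are refined jointly with the backbone under an indicator curriculum, which this sketch deliberately omits.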