Double Additive Margin Softmax Loss for Face Recognition

Bibliographic Details
Published in: Applied Sciences, Vol. 10, no. 1, p. 60
Main Authors: Zhou, Shengwei; Chen, Caikou; Han, Guojiang; Hou, Xielian
Format: Journal Article
Language: English
Published: Basel, MDPI AG, 01.01.2020

Summary: Learning large-margin face features, whose intra-class variance is small and inter-class diversity is large, is one of the important challenges of feature learning when applying Deep Convolutional Neural Networks (DCNNs) to face recognition. Recently, an appealing line of research has been to incorporate an angular margin into the original softmax loss function to obtain discriminative deep features during the training of DCNNs. In this paper we propose a novel loss function, termed the double additive margin softmax loss (DAM-Softmax). The proposed loss has a clearer geometrical interpretation and yields highly discriminative features for face recognition. An extensive experimental evaluation against several recent state-of-the-art softmax loss functions is conducted on the relevant face recognition benchmarks: CASIA-Webface, LFW, CALFW, CPLFW, and CFP-FP. We show that the proposed loss function consistently outperforms the state of the art.
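The record's abstract does not give the exact DAM-Softmax formulation, only that it belongs to the family of margin-based softmax losses. As a rough orientation, here is a minimal NumPy sketch of the single additive cosine-margin softmax (AM-Softmax/CosFace style) that this line of work builds on; the function name and the default values of the scale `s` and margin `m` are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def am_softmax_loss(features, weights, labels, s=30.0, m=0.35):
    """Additive cosine-margin softmax loss (AM-Softmax/CosFace style sketch).

    features: (N, d) embeddings; weights: (d, C) class weight vectors;
    labels: (N,) integer class ids. s scales the cosine logits, and m is
    the additive margin subtracted from the ground-truth class cosine.
    """
    # L2-normalize embeddings and class weights so the logits are cosines.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = f @ w                                   # (N, C) cosine similarities

    # Subtract the margin only from each sample's target-class cosine,
    # which forces the target cosine to exceed the others by at least m.
    cos_margined = cos.copy()
    cos_margined[np.arange(len(labels)), labels] -= m
    logits = s * cos_margined

    # Numerically stable cross-entropy over the margined, scaled logits.
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

Because the margin lowers only the target-class logit, the loss with m > 0 is strictly larger than the plain normalized softmax loss on the same batch, which is what pushes training toward larger inter-class separation. The paper's DAM-Softmax applies the additive-margin idea twice; its precise form should be taken from the full text.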
ISSN: 2076-3417
DOI: 10.3390/app10010060