Dictionary Representation of Deep Features for Occlusion-Robust Face Recognition


Bibliographic Details
Published in: IEEE Access, Vol. 7, pp. 26595–26605
Main Authors: Cen, Feng; Wang, Guanghui
Format: Journal Article
Language: English
Published: Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2019

Abstract Deep learning has achieved exciting results in face recognition; however, accuracy is still unsatisfactory for occluded faces. To improve robustness to occlusion, this paper proposes a novel deep dictionary representation-based classification scheme, in which a convolutional neural network is employed as the feature extractor and is followed by a dictionary that linearly codes the extracted deep features. The dictionary is composed of a gallery part, consisting of the deep features of the training samples, and an auxiliary part, consisting of mapping vectors acquired from subjects either inside or outside the training set and associated with the occlusion patterns of the testing face samples. A squared Euclidean norm is used to regularize the coding coefficients. The proposed scheme is computationally efficient and robust to large contiguous occlusion. In addition, it is generic for both occluded and non-occluded face images and works with a single training sample per subject. Extensive experimental evaluations demonstrate the superior performance of the proposed approach over other state-of-the-art algorithms.
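The coding step described in the abstract, linear coding over a dictionary with a squared Euclidean norm regularizer on the coefficients, has a closed-form ridge-regression solution. The sketch below is an illustrative simplification, not the authors' implementation: the function name, the regularization weight, and the class-wise residual decision rule are assumptions, and the auxiliary occlusion atoms are omitted for brevity (the gallery part alone is modeled).

```python
import numpy as np

def ddrc_classify(x, D, labels, lam=0.01):
    """Toy sketch of dictionary-representation classification.

    x      : deep feature vector of the test face, shape (d,)
    D      : dictionary whose columns are gallery deep features, shape (d, n)
    labels : class label of each dictionary column, shape (n,)
    lam    : weight of the squared Euclidean norm regularizer (assumed value)
    """
    # Coding: min_a ||x - D a||^2 + lam ||a||^2 has the ridge closed form
    a = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ x)
    # Identification: pick the class whose atoms best reconstruct x
    best_c, best_r = None, np.inf
    for c in np.unique(labels):
        m = labels == c
        r = np.linalg.norm(x - D[:, m] @ a[m])
        if r < best_r:
            best_c, best_r = c, r
    return best_c
```

Because the regularizer is a squared Euclidean norm rather than an L1 norm, the coefficients come from a single linear solve instead of an iterative sparse-coding loop, which is the source of the computational efficiency the abstract claims.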
Author Cen, Feng
Wang, Guanghui
Author_xml – sequence: 1
  givenname: Feng
  orcidid: 0000-0002-0825-385X
  surname: Cen
  fullname: Cen, Feng
  email: feng.cen@tongji.edu.cn
  organization: Department of Control Science and Engineering, College of Electronics and Information Engineering, Tongji University, Shanghai, China
– sequence: 2
  givenname: Guanghui
  orcidid: 0000-0003-3182-104X
  surname: Wang
  fullname: Wang, Guanghui
  organization: Department of Electrical Engineering and Computer Science, The University of Kansas, Lawrence, KS, USA
CODEN IAECCG
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2019
DOI 10.1109/ACCESS.2019.2901376
DatabaseName IEEE Xplore (IEEE)
IEEE Xplore Open Access (Activated by CARLI)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Computer and Information Systems Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
METADEX
Technology Research Database
Materials Research Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
DOAJ Directory of Open Access Journals
Database_xml – sequence: 1
  dbid: DOA
  name: DOAJ Directory of Open Access Journals
  url: https://www.doaj.org/
  sourceTypes: Open Website
– sequence: 2
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
EISSN 2169-3536
EndPage 26605
ExternalDocumentID oai_doaj_org_article_366aa6ccff9c489d986f9ece1daa524d
10_1109_ACCESS_2019_2901376
8651449
Genre orig-research
GrantInformation_xml – fundername: China Scholarship Council
  funderid: 10.13039/501100004543
– fundername: Shanghai Agriculture Applied Technology Development Program, China
  grantid: G20180306
ISSN 2169-3536
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/OAPA.html
LinkModel DirectLink
ORCID 0000-0002-0825-385X
0000-0003-3182-104X
OpenAccessLink https://ieeexplore.ieee.org/document/8651449
PQID 2455607146
PQPubID 4845423
PageCount 11
PublicationCentury 2000
PublicationDate 2019
PublicationDate_xml – year: 2019
  text: 20190000
PublicationDecade 2010
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE Access
PublicationTitleAbbrev Access
PublicationYear 2019
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
SourceID doaj
proquest
crossref
ieee
SourceType Open Website
Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 26595
SubjectTerms Algorithms
Artificial neural networks
convolutional neural network
deep learning
Dictionaries
dictionary representation
Face
Face recognition
Feature extraction
Feature recognition
Machine learning
Occlusion
occlusion-robust
Representations
Testing
Training
Visualization
Title Dictionary Representation of Deep Features for Occlusion-Robust Face Recognition
URI https://ieeexplore.ieee.org/document/8651449
https://www.proquest.com/docview/2455607146
https://doaj.org/article/366aa6ccff9c489d986f9ece1daa524d
Volume 7