Dep-FER: Facial Expression Recognition in Depressed Patients Based on Voluntary Facial Expression Mimicry
Facial expressions are important nonverbal behaviors that humans use to express their feelings. Clinical research has shown that depressed patients have poor facial expressiveness and mimicry. As a result, we propose a VFEM experiment with seven expressions to explore variations in facial expressio...
Published in | IEEE Transactions on Affective Computing, Vol. 15, no. 3, pp. 1725-1738 |
Main Authors | Ye, Jiayu; Yu, Yanhong; Zheng, Yunshao; Liu, Yang; Wang, Qingxiang |
Format | Journal Article |
Language | English |
Published | Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.07.2024 |
Subjects | Affective computing; Deep learning; Dep-FER; Depression; Face recognition; Feature extraction; Mimicry |
Abstract | Facial expressions are important nonverbal behaviors that humans use to express their feelings. Clinical research has shown that depressed patients have poor facial expressiveness and mimicry. We therefore propose a voluntary facial expression mimicry (VFEM) experiment covering seven expressions (anger, disgust, fear, happiness, neutrality, sadness, and surprise) to explore differences in facial expression features between depressed patients and healthy controls. The VFEM experiments reveal that depressed patients frequently exhibit negative facial expressions. In this research we also propose a depression facial expression recognition (Dep-FER) model. Dep-FER comprises three novel and crucial components: Mask Multi-head Self-Attention (MMSA), a facial action unit similarity loss function (AUs Loss), and a case-control loss function (CC Loss). MMSA filters out disturbing samples and forces the model to learn the relationships between different samples. AUs Loss uses the similarity between each expression's action units and the model output to improve the model's generalization ability. CC Loss exploits the intrinsic link between the depressed-patient and control categories. Dep-FER achieves excellent performance on VFEM and outperforms existing comparative models. |
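The MMSA idea summarized in the abstract (suppressing attention between "disturbing" sample pairs so that only informative relationships are learned) can be illustrated with a generic masked self-attention sketch. This is not the paper's implementation; the mask construction and the single-head formulation here are hypothetical, shown only to make the masking mechanism concrete:

```python
import numpy as np

def masked_self_attention(x, mask):
    """Single-head scaled dot-product self-attention over a batch of samples.

    x: (n, d) array of n sample embeddings.
    mask: (n, n) array; 1 lets sample i attend to sample j, 0 suppresses the pair.
    Returns the re-weighted features and the attention-weight matrix.
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)               # pairwise attention logits
    scores = np.where(mask == 1, scores, -1e9)  # masked pairs get ~zero weight
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)          # softmax over the remaining pairs
    return w @ x, w

# Hypothetical usage: treat sample 3 as a disturbing sample that the
# other samples should not attend to (it still attends to itself so
# its own softmax row stays well-defined).
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
mask = np.ones((4, 4))
mask[:, 3] = 0
mask[3, 3] = 1
out, w = masked_self_attention(x, mask)
```

The key point is that masking is applied to the logits before the softmax, so suppressed pairs receive effectively zero attention weight rather than being renormalized afterwards.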
Author | Zheng, Yunshao Liu, Yang Ye, Jiayu Wang, Qingxiang Yu, Yanhong |
Author_xml | – sequence: 1 givenname: Jiayu orcidid: 0000-0003-0368-9651 surname: Ye fullname: Ye, Jiayu email: yejiayu97@outlook.com organization: Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Qilu University of Technology (Shandong Academy of Sciences), Jinan, China – sequence: 2 givenname: Yanhong orcidid: 0000-0002-6547-6320 surname: Yu fullname: Yu, Yanhong organization: Shandong University of Traditional Chinese Medicine, Jinan, China – sequence: 3 givenname: Yunshao orcidid: 0009-0008-0076-2968 surname: Zheng fullname: Zheng, Yunshao organization: Shandong Mental Health Center, Jinan, China – sequence: 4 givenname: Yang orcidid: 0009-0005-0759-2728 surname: Liu fullname: Liu, Yang organization: Guangzhou University of Traditional Chinese Medicine, Guangzhou, China – sequence: 5 givenname: Qingxiang orcidid: 0000-0002-8159-7739 surname: Wang fullname: Wang, Qingxiang email: wangqx@qlu.edu.cn organization: Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Qilu University of Technology (Shandong Academy of Sciences), Jinan, China |
CODEN | ITACBQ |
CitedBy_id | crossref_primary_10_3390_computers14010029 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
Copyright_xml | – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
DOI | 10.1109/TAFFC.2024.3370103 |
DatabaseName | IEEE Xplore (IEEE) IEEE All-Society Periodicals Package (ASPP) 1998-Present IEEE Electronic Library (IEL) CrossRef Computer and Information Systems Abstracts Technology Research Database ProQuest Computer Science Collection Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional |
DatabaseTitle | CrossRef Computer and Information Systems Abstracts Technology Research Database Computer and Information Systems Abstracts – Academic Advanced Technologies Database with Aerospace ProQuest Computer Science Collection Computer and Information Systems Abstracts Professional |
DatabaseTitleList | Computer and Information Systems Abstracts |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Computer Science |
EISSN | 1949-3045 |
EndPage | 1738 |
ExternalDocumentID | 10_1109_TAFFC_2024_3370103 10449383 |
Genre | orig-research |
GrantInformation_xml | – fundername: Key Technology Research and Development Program of Shandong Province; Key Research and Development Program of Shandong Province grantid: 2020CXGC010901; 2021SFGC0504 funderid: 10.13039/100014103 – fundername: Talent Training and Promotion Plan of Qilu University of Technology grantid: Shandong Academy of Sciences; 2021PY06007 – fundername: Natural Science Foundation of Shandong Province; Shandong Provincial Natural Science Foundation grantid: ZR2021MF079 funderid: 10.13039/501100007129 – fundername: National Natural Science Foundation of China grantid: 81573829 funderid: 10.13039/501100001809 |
ISSN | 1949-3045 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 3 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | 0000-0002-6547-6320 0009-0008-0076-2968 0009-0005-0759-2728 0000-0003-0368-9651 0000-0002-8159-7739 |
PQID | 3101334538 |
PQPubID | 2040414 |
PageCount | 14 |
ParticipantIDs | crossref_citationtrail_10_1109_TAFFC_2024_3370103 proquest_journals_3101334538 ieee_primary_10449383 crossref_primary_10_1109_TAFFC_2024_3370103 |
PublicationCentury | 2000 |
PublicationDate | 2024-07-01 |
PublicationDateYYYYMMDD | 2024-07-01 |
PublicationDate_xml | – month: 07 year: 2024 text: 2024-07-01 day: 01 |
PublicationDecade | 2020 |
PublicationPlace | Piscataway |
PublicationPlace_xml | – name: Piscataway |
PublicationTitle | IEEE transactions on affective computing |
PublicationTitleAbbrev | TAFFC |
PublicationYear | 2024 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
Publisher_xml | – name: IEEE – name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
SourceID | proquest crossref ieee |
SourceType | Aggregation Database Enrichment Source Index Database Publisher |
StartPage | 1725 |
SubjectTerms | Affective computing Deep learning Dep-FER Depression Face recognition Faces Feature extraction FER Mimicry Similarity Task analysis Uncertainty
Title | Dep-FER: Facial Expression Recognition in Depressed Patients Based on Voluntary Facial Expression Mimicry |
URI | https://ieeexplore.ieee.org/document/10449383 https://www.proquest.com/docview/3101334538 |
Volume | 15 |