SFF-DA: Spatiotemporal Feature Fusion for Nonintrusively Detecting Anxiety
The early detection of anxiety disorders is crucial in mitigating distress and enhancing outcomes for individuals with mental disorders. Deep learning methods and traditional machine learning approaches are both used for the early screening of mental disorders, particularly those with anxiety symptoms.
Published in | IEEE Transactions on Instrumentation and Measurement, Vol. 73, p. 1 |
Main Authors | Mo, Haimiao; Li, Yuchen; Han, Peng; Liao, Xiao; Zhang, Wei; Ding, Shuai |
Format | Journal Article |
Language | English |
Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2024 |
Subjects | Anxiety; Anxiety disorders; Data mining; Deep learning; Facial Video Understanding; Feature extraction; Feature Fusion; Few-Shot Learning; Machine learning; Mental disorders; Mouth; Nonintrusive Anxiety Detection; Physiology; Signs and symptoms; Spatiotemporal Feature Extraction; Spatiotemporal phenomena; Training |
Abstract | The early detection of anxiety disorders is crucial in mitigating distress and enhancing outcomes for individuals with mental disorders. Deep learning methods and traditional machine learning approaches are both used for the early screening of mental disorders, particularly those with anxiety symptoms. These methods excel at extracting spatiotemporal features associated with mental disorders; however, they often overlook potential interrelationships among these features. Furthermore, the effectiveness of existing methods is hindered by disparities in the quality of subject data collected in nonlaboratory settings, limited data sample sizes, and other factors. Therefore, we propose a nonintrusive anxiety detection framework based on spatiotemporal feature fusion. Within this framework, spatiotemporal features are extracted from physiological and behavioral data through a shared feature extraction network. Additionally, we design a few-shot learning architecture to compute the coupling of fused spatiotemporal features, assessing the similarity of various feature types within sample pairs. Furthermore, joint training strategies applied within the framework significantly enhance classification performance. We validate our framework through experiments on a real-world seafarer dataset. The experimental results demonstrate that our framework outperforms comparative approaches. |
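The abstract outlines three technical pieces: a shared spatiotemporal feature extraction network applied to physiological and behavioral data, fusion of the resulting features, and a few-shot learning architecture that scores the similarity ("coupling") of fused features within sample pairs. The sketch below is only a minimal illustration of that general pattern in PyTorch; it is not the authors' SFF-DA implementation, and every module name, layer choice, and dimension is an assumption made for demonstration.

```python
# Illustrative sketch only: modality-specific projections, a shared
# spatiotemporal encoder, feature fusion, and a pairwise similarity head.
# Not the paper's actual architecture; all names and sizes are hypothetical.
import torch
import torch.nn as nn


class SpatiotemporalEncoder(nn.Module):
    """1-D conv ("spatial") + GRU ("temporal") encoder shared by both modalities."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.conv = nn.Conv1d(hidden, hidden, kernel_size=5, padding=2)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, hidden) -> embedding: (batch, hidden)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        _, last = self.gru(h)
        return last.squeeze(0)


class PairSimilarityModel(nn.Module):
    """Project each modality, encode with a shared network, fuse, then score pairs."""

    def __init__(self, physio_ch: int, behav_ch: int, hidden: int = 64):
        super().__init__()
        self.proj_physio = nn.Linear(physio_ch, hidden)  # modality-specific input projections
        self.proj_behav = nn.Linear(behav_ch, hidden)
        self.encoder = SpatiotemporalEncoder(hidden)     # shared feature extractor
        self.fuse = nn.Linear(2 * hidden, hidden)        # spatiotemporal feature fusion
        self.relation = nn.Sequential(                   # pairwise similarity head
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def embed(self, physio: torch.Tensor, behav: torch.Tensor) -> torch.Tensor:
        zp = self.encoder(self.proj_physio(physio))
        zb = self.encoder(self.proj_behav(behav))
        return torch.relu(self.fuse(torch.cat([zp, zb], dim=-1)))

    def forward(self, sample_a, sample_b) -> torch.Tensor:
        za, zb = self.embed(*sample_a), self.embed(*sample_b)
        return torch.sigmoid(self.relation(torch.cat([za, zb], dim=-1))).squeeze(-1)


if __name__ == "__main__":
    model = PairSimilarityModel(physio_ch=4, behav_ch=8)
    physio = torch.randn(2, 100, 4)  # e.g., physiological signal channels per time step
    behav = torch.randn(2, 100, 8)   # e.g., behavioral/facial descriptors per frame
    scores = model((physio, behav), (physio, behav))
    print(scores.shape)  # torch.Size([2]) -- one similarity score per sample pair
```

A relation-style head over concatenated pair embeddings is one common few-shot design choice; the paper's actual similarity computation and joint training strategy may differ from this sketch.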
Author | Mo, Haimiao; Liao, Xiao; Ding, Shuai; Li, Yuchen; Han, Peng; Zhang, Wei |
Author_xml | 1. Haimiao Mo (ORCID 0000-0001-6725-6703), School of Management, Hefei University of Technology, Hefei, China; 2. Yuchen Li (ORCID 0000-0002-8825-0527), Mental Health Center and West China Biomedical Big Data Center, West China Hospital, Sichuan University, Chengdu, China; 3. Peng Han (ORCID 0009-0008-5921-9062), School of Management, Hefei University of Technology, Hefei, China; 4. Xiao Liao (ORCID 0009-0009-8779-490X), Mental Health Center and West China Biomedical Big Data Center, West China Hospital, Sichuan University, Chengdu, China; 5. Wei Zhang (ORCID 0000-0003-3113-9577), Mental Health Center and West China Biomedical Big Data Center, West China Hospital, Sichuan University, Chengdu, China; 6. Shuai Ding (ORCID 0000-0002-8384-1950), School of Management, Hefei University of Technology, Hefei, China |
CODEN | IEIMAO |
CitedBy_id | 10.1109/TIM.2024.3406835; 10.1109/TIM.2024.3352713 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
DOI | 10.1109/TIM.2023.3341132 |
Discipline | Engineering Physics |
EISSN | 1557-9662 |
EndPage | 1 |
ExternalDocumentID | 10_1109_TIM_2023_3341132 10353993 |
Genre | orig-research |
GrantInformation_xml | China Scholarship Council, Grant 202106690030 (funder ID 10.13039/501100004543); National Natural Science Foundation of China, Grants 72293581, 72293580, and 72188101 (funder ID 10.13039/501100001809) |
ISSN | 0018-9456 |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | 0000-0003-3113-9577 0009-0009-8779-490X 0000-0001-6725-6703 0000-0002-8384-1950 0000-0002-8825-0527 0009-0008-5921-9062 |
PageCount | 1 |
PublicationCentury | 2000 |
PublicationDate | 2024-01-01 |
PublicationDecade | 2020 |
PublicationPlace | New York |
PublicationTitle | IEEE Transactions on Instrumentation and Measurement |
PublicationTitleAbbrev | TIM |
PublicationYear | 2024 |
Publisher | IEEE; The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 1 |
SubjectTerms | Anxiety; Anxiety disorders; Data mining; Deep learning; Facial Video Understanding; Feature extraction; Feature Fusion; Few-Shot Learning; Machine learning; Mental disorders; Mouth; Nonintrusive Anxiety Detection; Physiology; Signs and symptoms; Spatiotemporal Feature Extraction; Spatiotemporal phenomena; Training |
Title | SFF-DA: Spatiotemporal Feature Fusion for Nonintrusively Detecting Anxiety |
URI | https://ieeexplore.ieee.org/document/10353993 https://www.proquest.com/docview/2904674028 |
Volume | 73 |