Depression and Severity Detection Based on Body Kinematic Features: Using Kinect Recorded Skeleton Data of Simple Action
Relative limb movement is an important feature in assessing depression. In this study, we investigated whether a skeleton-mimetic task using natural stimuli can aid depression recognition. We innovatively used Kinect V2 to collect participant data: sequential skeletal data were extracted directly from the original Kinect 3D and tetrad coordinates of the participant's 25 body joints.
Saved in:
Published in | Frontiers in neurology Vol. 13; p. 905917 |
---|---|
Main Authors | Yu, Yanhong; Li, Wentao; Zhao, Yue; Ye, Jiayu; Zheng, Yunshao; Liu, Xinxin; Wang, Qingxiang |
Format | Journal Article |
Language | English |
Published | Frontiers Media S.A, 30.06.2022 |
Subjects | deep learning; depression recognition; human skeleton; kinect sensor; Neurology; temporal convolution network |
Online Access | Get full text |
Abstract | Relative limb movement is an important feature in assessing depression. In this study, we investigated whether a skeleton-mimetic task using natural stimuli can aid depression recognition. We innovatively used Kinect V2 to collect participant data: sequential skeletal data were extracted directly from the original Kinect 3D and tetrad coordinates of the participant's 25 body joints. After data preparation, two constructed skeletal datasets of whole-body joints (one for binary and one for multi-class classification) were fed into the proposed model for depression recognition. We improved the temporal convolutional network (TCN), creating a novel spatial attention dilated TCN (SATCN) that includes a hierarchy of temporal convolution groups with different dilation scales to capture important skeletal features, and a spatial attention block for final prediction. Experimental results show that depression and non-depression groups can be classified automatically with a maximum accuracy of 75.8% on the binary task, and with 64.3% accuracy on the multi-class dataset for finer-grained identification of depression severity. Our Kinect V2-based experiments and methods can not only identify and screen patients with depression but also effectively track their recovery, for example the transition from severe to moderate or mild depression in the multi-class setting. |
---|---|
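The two SATCN ingredients named in the abstract — dilated temporal convolutions at several scales, followed by spatial attention pooling over the 25 Kinect joints — can be sketched in plain Python. Everything below (function names, the 2-tap filter, the toy sequence) is an illustrative assumption for exposition, not the authors' implementation.

```python
# Sketch of the SATCN ideas from the abstract (hypothetical, simplified):
# (1) a causal dilated 1-D convolution over one joint's feature sequence, and
# (2) softmax spatial attention that pools scalar features across 25 joints.
import math

def dilated_conv1d(seq, weights, dilation):
    """Causal 1-D convolution: each output mixes samples spaced `dilation` apart."""
    out = []
    for t in range(len(seq)):
        acc = 0.0
        for i, w in enumerate(weights):
            j = t - i * dilation          # reach back i * dilation time steps
            if j >= 0:                    # causal: ignore taps before frame 0
                acc += w * seq[j]
        out.append(acc)
    return out

def softmax(xs):
    m = max(xs)                           # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def spatial_attention_pool(joint_feats, scores):
    """Pool per-joint features with softmax attention weights over the joints."""
    attn = softmax(scores)
    return sum(a * f for a, f in zip(attn, joint_feats))

# Toy usage: one joint's vertical coordinate over 8 frames, filtered at
# dilations 1, 2, 4 -- the "hierarchy of dilated convolution scales".
seq = [0.0, 0.1, 0.3, 0.2, 0.5, 0.4, 0.6, 0.5]
for d in (1, 2, 4):
    print(d, [round(v, 2) for v in dilated_conv1d(seq, [0.5, 0.5], d)])

# Toy attention over 25 joints: uniform scores reduce to a plain average.
feats = [float(j) for j in range(25)]
print(round(spatial_attention_pool(feats, [0.0] * 25), 2))  # -> 12.0
```

Stacking such layers with growing dilation widens the temporal receptive field cheaply, while the attention weights let the model emphasize the joints (e.g., hands, head) most informative for the task; in the actual model both operate on learned multi-channel features rather than raw coordinates.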
Author | Liu, Xinxin; Li, Wentao; Zhao, Yue; Zheng, Yunshao; Ye, Jiayu; Wang, Qingxiang; Yu, Yanhong |
AuthorAffiliation | 4 Shandong Mental Health Center, Shandong University , Jinan , China 1 College of Traditional Chinese Medicine, Shandong University of Traditional Chinese Medicine , Jinan , China 3 The First Affiliated Hospital of Shandong First Medical University & Shandong Provincial Qianfoshan Hospital , Jinan , China 2 School of Computer Science and Technology, Qilu University of Technology , Jinan , China |
AuthorAffiliation_xml | – name: 3 The First Affiliated Hospital of Shandong First Medical University & Shandong Provincial Qianfoshan Hospital , Jinan , China – name: 2 School of Computer Science and Technology, Qilu University of Technology , Jinan , China – name: 1 College of Traditional Chinese Medicine, Shandong University of Traditional Chinese Medicine , Jinan , China – name: 4 Shandong Mental Health Center, Shandong University , Jinan , China |
Author_xml | – sequence: 1 givenname: Yanhong surname: Yu fullname: Yu, Yanhong – sequence: 2 givenname: Wentao surname: Li fullname: Li, Wentao – sequence: 3 givenname: Yue surname: Zhao fullname: Zhao, Yue – sequence: 4 givenname: Jiayu surname: Ye fullname: Ye, Jiayu – sequence: 5 givenname: Yunshao surname: Zheng fullname: Zheng, Yunshao – sequence: 6 givenname: Xinxin surname: Liu fullname: Liu, Xinxin – sequence: 7 givenname: Qingxiang surname: Wang fullname: Wang, Qingxiang |
CitedBy_id | 10.1038/s41398-023-02481-8; 10.1111/jocn.17694; 10.5498/wjp.v14.i2.225 |
Cites_doi | 10.1109/FG.2013.6553796 10.1109/ICASSP.2019.8682916 10.1109/ACII.2013.87 10.1007/BF00993583 10.1016/j.jad.2019.08.009 10.1371/journal.pone.0216591 10.1016/j.cviu.2018.04.007 10.1037/a0025737 10.1589/jpts.27.2057 10.1016/j.jad.2018.08.073 10.1109/ComManTel.2013.6482417 10.1145/3266302.3268997 10.1097/PSY.0b013e3181a2515c 10.3969/j.issn.1673-5374.2013.31.003 10.1109/TAFFC.2016.2573832 10.1016/S0140-6736(18)32279-7 10.1016/j.ridd.2011.07.002 10.1109/MMUL.2012.24 10.1007/s10919-010-0094-x 10.1016/j.jad.2021.08.090 10.1145/2382336.2382355 10.1145/3347320.3357696 10.1109/ISGS51981.2020.9375290 10.4324/9780429473678 10.1109/ACII.2013.31 10.1109/ACCESS.2019.2957179 10.1109/T-AFFC.2012.16 10.1007/978-3-642-70486-4_14 10.1109/TAFFC.2017.2724035 10.1016/S0262-4079(13)60796-4 10.1109/JBHI.2017.2676878 10.1016/j.jvcir.2018.11.003 10.1007/978-3-030-34869-4_3 10.1080/08990220.2018.1444599 10.1145/3347320.3357697 10.3390/jpm12040619 |
ContentType | Journal Article |
Copyright | Copyright © 2022 Yu, Li, Zhao, Ye, Zheng, Liu and Wang. |
Copyright_xml | – notice: Copyright © 2022 Yu, Li, Zhao, Ye, Zheng, Liu and Wang. – notice: Copyright © 2022 Yu, Li, Zhao, Ye, Zheng, Liu and Wang. 2022 Yu, Li, Zhao, Ye, Zheng, Liu and Wang |
DBID | AAYXX CITATION 7X8 5PM DOA |
DOI | 10.3389/fneur.2022.905917 |
DatabaseName | CrossRef MEDLINE - Academic PubMed Central (Full Participant titles) DOAJ Open Access Full Text |
DatabaseTitle | CrossRef MEDLINE - Academic |
DatabaseTitleList | MEDLINE - Academic CrossRef |
Database_xml | – sequence: 1 dbid: DOA name: DOAJ Open Access Full Text url: https://www.doaj.org/ sourceTypes: Open Website |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Medicine |
EISSN | 1664-2295 |
ExternalDocumentID | oai_doaj_org_article_d017f95fa4f64e0aa4cb3ec76c882af4 PMC9279697 10_3389_fneur_2022_905917 |
GrantInformation_xml | – fundername: ; grantid: ZR2021MF079; ZR2020MF039 |
GroupedDBID | 53G 5VS 9T4 AAFWJ AAKDD AAYXX ACGFO ACGFS ACXDI ADBBV ADRAZ AENEX AFPKN ALMA_UNASSIGNED_HOLDINGS AOIJS BAWUL BCNDV CITATION DIK E3Z EMOBN F5P GROUPED_DOAJ GX1 HYE KQ8 M48 M~E O5R O5S OK1 P2P PGMZT RNS RPM 7X8 5PM |
ID | FETCH-LOGICAL-c442t-54968f6b9d4fa9fde6547e3076c99cc0a286965715d4f21d2a9a7693f4b461d43 |
IEDL.DBID | M48 |
ISSN | 1664-2295 |
IngestDate | Wed Aug 27 01:26:38 EDT 2025 Thu Aug 21 18:36:57 EDT 2025 Fri Jul 11 16:50:27 EDT 2025 Tue Jul 01 04:28:01 EDT 2025 Thu Apr 24 22:56:52 EDT 2025 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
LinkModel | DirectLink |
MergedId | FETCHMERGED-LOGICAL-c442t-54968f6b9d4fa9fde6547e3076c99cc0a286965715d4f21d2a9a7693f4b461d43 |
Notes | ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2. Edited by: Mariella Pazzaglia, Sapienza University of Rome, Italy. Reviewed by: Benyue Su, Tongling University, China; Na Jin Seo, Medical University of South Carolina, United States. This article was submitted to Neurology, a section of the journal Frontiers in Neurology. These authors have contributed equally to this work and share first authorship |
OpenAccessLink | http://journals.scholarsportal.info/openUrl.xqy?doi=10.3389/fneur.2022.905917 |
PMID | 35847201 |
PQID | 2691458201 |
PQPubID | 23479 |
ParticipantIDs | doaj_primary_oai_doaj_org_article_d017f95fa4f64e0aa4cb3ec76c882af4 pubmedcentral_primary_oai_pubmedcentral_nih_gov_9279697 proquest_miscellaneous_2691458201 crossref_primary_10_3389_fneur_2022_905917 crossref_citationtrail_10_3389_fneur_2022_905917 |
ProviderPackageCode | CITATION AAYXX |
PublicationCentury | 2000 |
PublicationDate | 2022-06-30 |
PublicationDateYYYYMMDD | 2022-06-30 |
PublicationDate_xml | – month: 06 year: 2022 text: 2022-06-30 day: 30 |
PublicationDecade | 2020 |
PublicationTitle | Frontiers in neurology |
PublicationYear | 2022 |
Publisher | Frontiers Media S.A |
Publisher_xml | – name: Frontiers Media S.A |
References | James (B1) 2018; 392 Zhao (B26) 2019; 14 Song (B38) 2015; 27 Wang (B16) 2018; 57 Fang (B27) 2019; 7 Shen (B31) 2012 Firth (B29) 2013 Ellgring (B28) 2007 Yin (B34) 2019 Dael (B9) 2012; 12 Du (B20) 2018 Ye (B32) 2021; 295 Wang (B41) 2018; 171 Lea (B40) 2017 Gross (B7) 2010; 34 Almasi (B36) 2020 Haque (B22) 2018 (B2) 2017 Joshi (B25) 2013 Dibeklioğlu (B23) 2017; 22 Davison (B15) 2016; 9 Joshi (B6) 2013 Du (B17) 2019 Natale (B30) 1980; 4 Pampouchidou (B12) 2017; 10 (B37) 2018; 35 Maggio (B11) 2022; 12 Zhang (B13) 2012; 19 Michalak (B8) 2009; 71 Rourke (B42) 1989 Lee (B24) 2018; 241 He (B45) 2016 Huang (B21) 2019 Pigoni (B3) 2019; 259 Stratou (B14) 2013 Ray (B33) 2019 Simonyan (B46) 2014 Bao (B35) 2013; 8 Cooper (B4) 2018 Kleinsmith (B5) 2012; 4 Hou (B44) 2018 Huang (B19) 2019 Hamilton (B43) 1986 Rathi (B18) 2019 Chang (B10) 2011; 32 Le (B39) 2013 |
References_xml | – start-page: 1 volume-title: 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG) year: 2013 ident: B6 article-title: Can body expressions contribute to automatic depression analysis? doi: 10.1109/FG.2013.6553796 – start-page: 5856 volume-title: ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) year: 2019 ident: B19 article-title: Speech landmark bigrams for depression detection from naturalistic smartphone speech doi: 10.1109/ICASSP.2019.8682916 – start-page: 492 volume-title: 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction year: 2013 ident: B25 article-title: Relative body parts movement for automatic depression analysis doi: 10.1109/ACII.2013.87 – volume: 4 start-page: 323 year: 1980 ident: B30 article-title: The effect of Velten's mood-induction procedure for depression on hand movement and head-down posture publication-title: Motivat Emot doi: 10.1007/BF00993583 – volume: 259 start-page: 21 year: 2019 ident: B3 article-title: Can Machine Learning help us in dealing with treatment resistant depression? 
A review publication-title: J Affect Disord doi: 10.1016/j.jad.2019.08.009 – volume: 14 start-page: e0216591 year: 2019 ident: B26 article-title: See your mental state from your walk: Recognizing anxiety and depression through Kinect-recorded gait data publication-title: PLoS ONE doi: 10.1371/journal.pone.0216591 – volume: 171 start-page: 118 year: 2018 ident: B41 article-title: RGB-D-based human motion recognition with deep learning: a survey publication-title: Comput Vision Image Understand doi: 10.1016/j.cviu.2018.04.007 – volume: 12 start-page: 1085 year: 2012 ident: B9 article-title: Emotion expression in body action and posture publication-title: Emotion doi: 10.1037/a0025737 – volume-title: arXiv [Preprint] arXiv: year: 2018 ident: B22 article-title: Measuring depression symptom severity from spoken language and 3D facial expressions – volume: 27 start-page: 2057 year: 2015 ident: B38 article-title: Effect of virtual reality games on stroke patients' balance, gait, depression, and interpersonal relationships publication-title: J Phys Therapy Sci doi: 10.1589/jpts.27.2057 – volume: 241 start-page: 519 year: 2018 ident: B24 article-title: Applications of machine learning algorithms to predict therapeutic outcomes in depression: a meta-analysis and systematic review publication-title: J Affect Disord doi: 10.1016/j.jad.2018.08.073 – volume-title: J Clin Psychiatry year: 1989 ident: B42 article-title: Treatment of seasonal depression with d-fenfluramine – start-page: 340 volume-title: 2013 International Conference on Computing, Management and Telecommunications (ComManTel) year: 2013 ident: B39 article-title: Human posture recognition using human skeleton provided by Kinect doi: 10.1109/ComManTel.2013.6482417 – start-page: 23 volume-title: Proceedings of the 2018 on Audio/Visual Emotion Challenge and Workshop year: 2018 ident: B20 article-title: Bipolar disorder recognition via multi-scale discriminative audio temporal representation doi: 10.1145/3266302.3268997 
– start-page: 770 volume-title: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition year: 2016 ident: B45 article-title: Deep residual learning for image recognition – volume: 71 start-page: 580 year: 2009 ident: B8 article-title: Embodiment of sadness and depression-gait patterns associated with dysphoric mood publication-title: Psychosom. Med doi: 10.1097/PSY.0b013e3181a2515c – volume: 8 start-page: 2904 year: 2013 ident: B35 article-title: Mechanism of Kinect-based virtual reality training for motor functional recovery of upper limbs after subacute stroke publication-title: Neural Regen Res doi: 10.3969/j.issn.1673-5374.2013.31.00310.3969/j.issn.1673-5374.2013.31.003 – volume: 9 start-page: 116 year: 2016 ident: B15 article-title: SAMM: a spontaneous micro-facial movement dataset publication-title: IEEE Trans Affect Comput doi: 10.1109/TAFFC.2016.2573832 – volume: 392 start-page: 1789 year: 2018 ident: B1 article-title: Global, regional, and national incidence, prevalence, and years lived with disability for 354 diseases and injuries for 195 countries and territories, 1990-2017: a systematic analysis for the Global Burden of Disease Study 2017 publication-title: Lancet doi: 10.1016/S0140-6736(18)32279-7 – volume: 32 start-page: 2566 year: 2011 ident: B10 article-title: A Kinect-based system for physical rehabilitation: a pilot study for young adults with motor disabilities publication-title: Res Dev Disabil doi: 10.1016/j.ridd.2011.07.002 – volume: 19 start-page: 4 year: 2012 ident: B13 article-title: Microsoft kinect sensor and its effect publication-title: IEEE Multimedia doi: 10.1109/MMUL.2012.24 – volume-title: Made in Viet Nam Vaccines: Efforts to Develop Sustainable In-Country Manufacturing for Seasonal and Pandemic Influenza Vaccines: Consultation Held in Viet Nam year: 2017 ident: B2 – volume: 34 start-page: 223 year: 2010 ident: B7 article-title: Methodology for assessing bodily expression of emotion publication-title: J 
Nonverb Behav doi: 10.1007/s10919-010-0094-x – volume: 295 start-page: 904 year: 2021 ident: B32 article-title: Multi-modal depression detection based on emotional audio and evaluation text publication-title: J Affect Disord doi: 10.1016/j.jad.2021.08.090 – volume-title: arXiv [Preprint] arXiv: year: 2014 ident: B46 article-title: Very deep convolutional networks for large-scale image recognition – start-page: 66 volume-title: Proceedings of the 4th International Conference on Internet Multimedia Computing and Service year: 2012 ident: B31 article-title: Unsupervised human skeleton extraction from Kinect depth images doi: 10.1145/2382336.2382355 – start-page: 65 volume-title: Proceedings of the 9th International on Audio/Visual Emotion Challenge and Workshop year: 2019 ident: B34 article-title: A multi-modal hierarchical recurrent neural network for depression detection doi: 10.1145/3347320.3357696 – start-page: 51 volume-title: 2020 International Serious Games Symposium (ISGS) year: 2020 ident: B36 article-title: Kinect-based virtual rehabilitation for upper extremity motor recovery in chronic stroke doi: 10.1109/ISGS51981.2020.9375290 – volume-title: Diagnosing the Diagnostic and Statistical Manual of Mental Disorders year: 2018 ident: B4 doi: 10.4324/9780429473678 – start-page: 1 volume-title: 2019 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019) year: 2019 ident: B17 article-title: Encoding visual behaviors with attentive temporal convolution for depression prediction – start-page: 147 volume-title: 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction year: 2013 ident: B14 article-title: Automatic nonverbal behavior indicators of depression and PTSD: exploring gender differences doi: 10.1109/ACII.2013.31 – volume: 7 start-page: 174425 year: 2019 ident: B27 article-title: Depression prevalence in postgraduate students and its association with gait abnormality publication-title: IEEE Access 
doi: 10.1109/ACCESS.2019.2957179 – volume-title: IEEE Trans Affect Comput year: 2019 ident: B21 article-title: Investigation of speech landmark patterns for depression detection – start-page: 156 volume-title: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition year: 2017 ident: B40 article-title: Temporal convolutional networks for action segmentation and detection – volume: 4 start-page: 15 year: 2012 ident: B5 article-title: Affective body expression perception and recognition: a survey publication-title: IEEE Trans Affect Comput doi: 10.1109/T-AFFC.2012.16 – start-page: 143 volume-title: Assessment of Depression year: 1986 ident: B43 article-title: The Hamilton rating scale for depression doi: 10.1007/978-3-642-70486-4_14 – volume: 10 start-page: 445 year: 2017 ident: B12 article-title: Automatic assessment of depression based on visual cues: a systematic review publication-title: IEEE Trans Affect Comput doi: 10.1109/TAFFC.2017.2724035 – volume-title: Computers Diagnose Depression From our Body Language year: 2013 ident: B29 doi: 10.1016/S0262-4079(13)60796-4 – volume-title: Proceedings of the European Conference on Computer Vision (ECCV) year: 2018 ident: B44 article-title: Spatial-temporal attention res-TCN for skeleton-based dynamic hand gesture recognition – volume: 22 start-page: 525 year: 2017 ident: B23 article-title: Dynamic multimodal measurement of depression severity using deep autoencoding publication-title: IEEE J Biomed Health Inform doi: 10.1109/JBHI.2017.2676878 – volume: 57 start-page: 228 year: 2018 ident: B16 article-title: Facial expression video analysis for depression detection in Chinese patients publication-title: J Visual Commun Image Represent doi: 10.1016/j.jvcir.2018.11.003 – start-page: 22 volume-title: International Conference on Pattern Recognition and Machine Intelligence year: 2019 ident: B18 article-title: Enhanced depression detection from facial cues using univariate feature selection techniques 
doi: 10.1007/978-3-030-34869-4_3 – volume: 35 start-page: 25 year: 2018 ident: B37 article-title: Effects of Kinect-based virtual reality game training on upper extremity motor recovery in chronic stroke publication-title: Somatosens Motor Res doi: 10.1080/08990220.2018.1444599 – start-page: 81 volume-title: Proceedings of the 9th International on Audio/Visual Emotion Challenge and Workshop year: 2019 ident: B33 article-title: Multi-level attention network using text, audio and video for depression prediction doi: 10.1145/3347320.3357697 – volume: 12 start-page: 619 year: 2022 ident: B11 article-title: Body representation in patients with severe spinal cord injury: a pilot study on the promising role of powered exoskeleton for gait training publication-title: J Pers Med doi: 10.3390/jpm12040619 – volume-title: Non-Verbal Communication in Depression year: 2007 ident: B28 |
SSID | ssj0000399363 |
Score | 2.3074353 |
Snippet | Relative limb movement is an important feature in assessing depression. In this study, we looked into whether a skeleton-mimetic task using natural stimuli may... |
SourceID | doaj pubmedcentral proquest crossref |
SourceType | Open Website Open Access Repository Aggregation Database Enrichment Source Index Database |
StartPage | 905917 |
SubjectTerms | deep learning depression recognition human skeleton kinect sensor Neurology temporal convolution network |
Title | Depression and Severity Detection Based on Body Kinematic Features: Using Kinect Recorded Skeleton Data of Simple Action |
URI | https://www.proquest.com/docview/2691458201 https://pubmed.ncbi.nlm.nih.gov/PMC9279697 https://doaj.org/article/d017f95fa4f64e0aa4cb3ec76c882af4 |
Volume | 13 |
hasFullText | 1 |
inHoldings | 1 |