Enhanced residual attention-based subject-specific network (ErAS-Net): facial expression-based pain classification with multiple attention mechanisms

Bibliographic Details
Published in Scientific Reports, Vol. 15, No. 1, Article 19425 (16 pages)
Main Authors Morsali, Mahdi; Ghaffari, Aboozar
Format Journal Article
Language English
Published London: Nature Publishing Group UK, 03.06.2025
Nature Publishing Group
Nature Portfolio
Abstract Automatic detection of pain from facial expressions is one of the most important challenges in healthcare. A major difficulty is the variability in how individuals express pain and other emotions through their facial deformations. This research addresses that issue by presenting ErAS-Net, an Enhanced Residual Attention-Based Subject-Specific Network that employs multiple attention mechanisms. Through transfer learning and these attention mechanisms, the proposed deep learning model is designed to mimic human perception of facial expressions, improving its pain recognition ability and capturing the features specific to each individual's expression patterns. On the UNBC-McMaster Shoulder Pain dataset, the proposed model achieves 98.77% accuracy for binary classification and 94.21% for four-level pain intensity classification under ten-fold cross-validation, and 89.83% accuracy for binary classification under Leave-One-Subject-Out (LOSO) validation. To further evaluate generalizability, a cross-dataset experiment was conducted on the BioVid Heat Pain Database, where ErAS-Net achieved 78.14% accuracy for binary pain detection on unseen data without fine-tuning. These results support the use of attention mechanisms modeled on human perception and indicate that the proposed model is a powerful and reliable tool for automatic pain detection.
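The abstract describes a residual backbone refined by multiple attention mechanisms and evaluated with ten-fold, LOSO, and cross-dataset protocols. The paper's exact ErAS-Net architecture is not reproduced in this record, so the following is only a minimal illustrative sketch of the general pattern the title names: a residual block whose output features are reweighted by channel and spatial attention (CBAM-style). All module names (ChannelAttention, SpatialAttention, AttentionResidualBlock) and layer sizes are assumptions chosen for clarity, not the authors' implementation.

```python
# Illustrative sketch only: a CBAM-style channel + spatial attention wrapper
# around a residual block. This is NOT the authors' ErAS-Net; module names and
# layer sizes are assumptions for illustration.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.max_pool = nn.AdaptiveMaxPool2d(1)
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
        )

    def forward(self, x):
        # Weight each feature channel by pooled global context.
        attn = torch.sigmoid(self.mlp(self.avg_pool(x)) + self.mlp(self.max_pool(x)))
        return x * attn


class SpatialAttention(nn.Module):
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        # Weight spatial locations (e.g. expressive facial regions) by a 2-D mask.
        avg_map = x.mean(dim=1, keepdim=True)
        max_map, _ = x.max(dim=1, keepdim=True)
        attn = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn


class AttentionResidualBlock(nn.Module):
    """Residual block whose output features are refined by attention."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.BatchNorm2d(channels),
        )
        self.channel_attn = ChannelAttention(channels)
        self.spatial_attn = SpatialAttention()

    def forward(self, x):
        out = self.body(x)
        out = self.spatial_attn(self.channel_attn(out))
        return torch.relu(out + x)  # residual (skip) connection


# Example: refine a batch of 64-channel feature maps from a pretrained backbone.
features = torch.randn(8, 64, 28, 28)
print(AttentionResidualBlock(64)(features).shape)  # torch.Size([8, 64, 28, 28])
```

The attention-refined features are added back through the skip connection, which is the usual way residual and attention modules are combined. Likewise, a minimal sketch of subject-wise Leave-One-Subject-Out (LOSO) evaluation, using scikit-learn's LeaveOneGroupOut with placeholder features and labels standing in for the network's outputs:

```python
# Illustrative LOSO sketch, assuming precomputed per-frame features X, binary
# pain labels y, and a subject id per frame; any classifier can stand in for
# the network here. Data below is random placeholder data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 32))            # placeholder features
y = rng.integers(0, 2, size=300)          # placeholder pain / no-pain labels
subjects = rng.integers(0, 10, size=300)  # placeholder subject ids

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))
print(f"LOSO mean accuracy: {np.mean(scores):.3f}")
```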
ArticleNumber 19425
Author Morsali, Mahdi
Ghaffari, Aboozar
Author_xml – sequence: 1
  givenname: Mahdi
  surname: Morsali
  fullname: Morsali, Mahdi
  organization: Department of Electrical Engineering, Iran University of Science and Technology
– sequence: 2
  givenname: Aboozar
  surname: Ghaffari
  fullname: Ghaffari, Aboozar
  email: aboozar_ghaffari@iust.ac.ir
  organization: Department of Electrical Engineering, Iran University of Science and Technology
BackLink https://www.ncbi.nlm.nih.gov/pubmed/40461564 (View this record in MEDLINE/PubMed)
ContentType Journal Article
Copyright The Author(s) 2025
2025. The Author(s).
Copyright Nature Publishing Group 2025
DOI 10.1038/s41598-025-04552-w
Discipline Biology
EISSN 2045-2322
EndPage 16
ExternalDocumentID oai_doaj_org_article_d863c2a3c7404719914543c80c17e2bf
PMC12134349
40461564
10_1038_s41598_025_04552_w
Genre Journal Article
ISSN 2045-2322
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 1
Keywords Deep learning
Attention mechanisms
Facial deformation
Transfer learning
Facial expression
Pain detection
Language English
License 2025. The Author(s).
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
OpenAccessLink https://doaj.org/article/d863c2a3c7404719914543c80c17e2bf
PMID 40461564
PQID 3215393141
PQPubID 2041939
PageCount 16
PublicationCentury 2000
PublicationDate 2025-06-03
PublicationDateYYYYMMDD 2025-06-03
PublicationDecade 2020
PublicationPlace London
PublicationPlace_xml – name: London
– name: England
PublicationTitle Scientific reports
PublicationTitleAbbrev Sci Rep
PublicationTitleAlternate Sci Rep
PublicationYear 2025
Publisher Nature Publishing Group UK
Nature Publishing Group
Nature Portfolio
StartPage 19425
SubjectTerms 639/166/985
639/166/987
Accuracy
Algorithms
Attention
Attention mechanisms
Classification
Deep Learning
Facial deformation
Facial Expression
Humanities and Social Sciences
Humans
multidisciplinary
Pain
Pain - classification
Pain - diagnosis
Pain detection
Pain Measurement - methods
Pain perception
Science
Science (multidisciplinary)
Transfer learning
Title Enhanced residual attention-based subject-specific network (ErAS-Net): facial expression-based pain classification with multiple attention mechanisms
URI https://link.springer.com/article/10.1038/s41598-025-04552-w
https://www.ncbi.nlm.nih.gov/pubmed/40461564
https://www.proquest.com/docview/3215393141
https://www.proquest.com/docview/3215573766
https://pubmed.ncbi.nlm.nih.gov/PMC12134349
https://doaj.org/article/d863c2a3c7404719914543c80c17e2bf
Volume 15