ST-SHAP: A hierarchical and explainable attention network for emotional EEG representation learning and decoding


Bibliographic Details
Published in: Journal of Neuroscience Methods, Vol. 414, p. 110317
Main Authors: Miao, Minmin; Liang, Jin; Sheng, Zhenzhen; Liu, Wenzhe; Xu, Baoguo; Hu, Wenjun
Format: Journal Article
Language: English
Published: Netherlands, Elsevier B.V., 01.02.2025
ISSN: 0165-0270
EISSN: 1872-678X
DOI: 10.1016/j.jneumeth.2024.110317

Abstract Emotion recognition using electroencephalogram (EEG) has become a research hotspot in the field of human–computer interaction; however, sufficiently learning complex spatial–temporal representations of emotional EEG data and obtaining explainable model predictions remain great challenges. In this study, a novel hierarchical and explainable attention network, ST-SHAP, which combines the Swin Transformer (ST) and the SHapley Additive exPlanations (SHAP) technique, is proposed for automatic emotional EEG classification. Firstly, a 3D spatial–temporal feature of emotional EEG data is generated via frequency band filtering, temporal segmentation, spatial mapping, and interpolation to fully preserve important spatial–temporal–frequency characteristics. Secondly, a hierarchical attention network is devised to sufficiently learn an abstract spatial–temporal representation of emotional EEG and perform classification. Concretely, in this decoding model, the W-MSA module models correlations within local windows, the SW-MSA module allows information interactions between different local windows, and the patch merging module further facilitates local-to-global multiscale modeling. Finally, the SHAP method is utilized to discover important brain regions for emotion processing and to improve the explainability of the Swin Transformer model. Two benchmark datasets, SEED and DREAMER, are used for classification performance evaluation. In the subject-dependent experiments, ST-SHAP achieves an average accuracy of 97.18% on the SEED dataset, while on the DREAMER dataset the average accuracy is 96.06% and 95.98% on the arousal and valence dimensions, respectively. In addition, important brain regions that conform to prior neurophysiological knowledge are discovered via a data-driven approach for both datasets. In terms of subject-dependent and subject-independent emotional EEG decoding accuracies, our method outperforms several closely related existing methods. These experimental results fully demonstrate the effectiveness and superiority of the proposed algorithm.
• A novel hierarchical attention network is designed for emotional EEG recognition.
• Global and local relationships of EEG are utilized for accurate emotion recognition.
• The SHAP algorithm is used to detect critical brain regions in emotion processing.
• Extensive experiments demonstrate the effectiveness of the proposed ST-SHAP.
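The feature-construction step (frequency band filtering followed by temporal segmentation) can be sketched roughly as below. The band boundaries, sampling rate, and the use of Gaussian differential entropy as the band feature are assumptions for illustration, and the paper's spatial mapping/interpolation onto a 2D electrode grid is omitted:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Standard EEG rhythm bands in Hz (assumed; the paper's exact bands may differ).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 50)}

def band_power_features(eeg, fs=200, seg_len=1.0):
    """Per-band, per-segment, per-channel features from raw EEG.

    eeg: (channels, samples) array. Returns (bands, segments, channels).
    The differential entropy of a Gaussian, 0.5*log(2*pi*e*var), serves
    as the band feature for each segment.
    """
    n_seg = int(eeg.shape[1] // (fs * seg_len))
    feats = np.zeros((len(BANDS), n_seg, eeg.shape[0]))
    for b, (lo, hi) in enumerate(BANDS.values()):
        # 4th-order Butterworth band-pass, zero-phase filtered.
        num, den = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(num, den, eeg, axis=1)
        for s in range(n_seg):
            seg = filtered[:, int(s * fs * seg_len):int((s + 1) * fs * seg_len)]
            feats[b, s] = 0.5 * np.log(2 * np.pi * np.e * seg.var(axis=1))
    return feats
```

Stacking the resulting (segments, channels) maps across bands yields a 3D tensor analogous to the spatial–temporal–frequency input the abstract describes.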
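The W-MSA/SW-MSA mechanism named in the abstract can be sketched minimally in NumPy. Identity Q/K/V projections and a single head are simplifying assumptions, not the paper's trained model:

```python
import numpy as np

def window_partition(x, win):
    """Split an (H, W, C) feature map into non-overlapping (win, win, C) windows."""
    H, W, C = x.shape
    x = x.reshape(H // win, win, W // win, win, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, win, win, C)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def window_attention(x, win=2):
    """Self-attention restricted to each local window (W-MSA-style)."""
    windows = window_partition(x, win)           # (n_windows, win, win, C)
    n, _, _, C = windows.shape
    tokens = windows.reshape(n, win * win, C)    # win*win tokens per window
    q = k = v = tokens                           # identity projections for brevity
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(C))
    return attn @ v                              # (n_windows, win*win, C)

def shifted_window_attention(x, win=2):
    """SW-MSA: roll the feature map so new windows straddle old window borders."""
    shifted = np.roll(x, shift=(-(win // 2), -(win // 2)), axis=(0, 1))
    return window_attention(shifted, win)
```

Alternating the two operations lets information propagate between windows while keeping attention cost linear in the number of windows; patch merging then downsamples between stages for the local-to-global hierarchy.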
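The explainability step rests on Shapley values, which SHAP approximates for large models. For a tiny model they can be computed exactly; the function `f` and the baseline-masking convention below are illustrative assumptions, not the paper's implementation:

```python
from itertools import combinations
from math import factorial
import numpy as np

def shapley_values(f, x, baseline):
    """Exact Shapley attributions for model f over the features of x.

    A feature absent from a coalition is replaced by its baseline value,
    a common SHAP masking convention. Cost is exponential in the number
    of features, so this is for illustration only.
    """
    n = len(x)
    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for S in combinations(others, size):
                # Weight of a coalition of this size in the Shapley average.
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                with_i, without_i = baseline.copy(), baseline.copy()
                for j in S:
                    with_i[j] = without_i[j] = x[j]
                with_i[i] = x[i]
                phi[i] += weight * (f(with_i) - f(without_i))
    return phi
```

For a linear model the attributions recover the coefficients scaled by (x − baseline); aggregating such per-input attributions over electrodes is how channel/region importance maps like the paper's can be derived.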
ArticleNumber 110317
Author_xml – sequence: 1
  givenname: Minmin
  orcidid: 0000-0002-8437-2412
  surname: Miao
  fullname: Miao, Minmin
  email: 02746@zjhu.edu.cn
  organization: School of Information Engineering, Huzhou University, Huzhou 313000, China
– sequence: 2
  givenname: Jin
  surname: Liang
  fullname: Liang, Jin
  organization: School of Information Engineering, Huzhou University, Huzhou 313000, China
– sequence: 3
  givenname: Zhenzhen
  surname: Sheng
  fullname: Sheng, Zhenzhen
  organization: School of Information Engineering, Huzhou University, Huzhou 313000, China
– sequence: 4
  givenname: Wenzhe
  surname: Liu
  fullname: Liu, Wenzhe
  organization: School of Information Engineering, Huzhou University, Huzhou 313000, China
– sequence: 5
  givenname: Baoguo
  surname: Xu
  fullname: Xu, Baoguo
  organization: School of Instrument Science and Engineering, Southeast University, Nanjing, 210096, China
– sequence: 6
  givenname: Wenjun
  surname: Hu
  fullname: Hu, Wenjun
  email: huwenjun@zjhu.edu.cn
  organization: School of Information Engineering, Huzhou University, Huzhou 313000, China
BackLink https://www.ncbi.nlm.nih.gov/pubmed/39542109 (View this record in MEDLINE/PubMed)
ContentType Journal Article
Copyright 2024 Elsevier B.V.
Copyright © 2024 Elsevier B.V. All rights reserved.
DOI 10.1016/j.jneumeth.2024.110317
DatabaseName CrossRef
Medline
MEDLINE
MEDLINE (Ovid)
PubMed
MEDLINE - Academic
DatabaseTitle CrossRef
MEDLINE
Medline Complete
MEDLINE with Full Text
PubMed
MEDLINE (Ovid)
MEDLINE - Academic
DatabaseTitleList MEDLINE
MEDLINE - Academic

Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: EIF
  name: MEDLINE
  url: https://www.webofscience.com/wos/medline/basic-search
  sourceTypes: Index Database
DeliveryMethod fulltext_linktorsrc
Discipline Medicine
Anatomy & Physiology
EISSN 1872-678X
ExternalDocumentID 39542109
10_1016_j_jneumeth_2024_110317
S0165027024002620
Genre Journal Article
ISSN 0165-0270
1872-678X
IsPeerReviewed true
IsScholarly true
Keywords Emotion recognition
Swin transformer
Self attention
Explainability
EEG
Language English
License Copyright © 2024 Elsevier B.V. All rights reserved.
LinkModel DirectLink
ORCID 0000-0002-8437-2412
PMID 39542109
PQID 3128826488
PQPubID 23479
PublicationCentury 2000
PublicationDate February 2025
PublicationDateYYYYMMDD 2025-02-01
PublicationDate_xml – month: 02
  year: 2025
  text: February 2025
PublicationDecade 2020
PublicationPlace Netherlands
PublicationPlace_xml – name: Netherlands
PublicationTitle Journal of neuroscience methods
PublicationTitleAlternate J Neurosci Methods
PublicationYear 2025
Publisher Elsevier B.V
Publisher_xml – name: Elsevier B.V
References Liu, Zhao, An, Zhao, Wang, Yan (b39) 2023; 85
Confalonieri, Coba, Wagner, Besold (b9) 2021; 11
Li, Yang, Li, Chen, Du (b29) 2020; 415
Sun, Wang, Zhao, Hao, Wang (b55) 2022; 10
Miao, Zheng, Xu, Yang, Hu (b42) 2023; 79
Jia, Ziyu, Lin, Youfang, Cai, Xiyang, Chen, Haobin, Gou, Haijun, Wang, Jing, 2020. SST-EmotionNet: Spatial-spectral-temporal based attention 3D dense network for EEG emotion recognition. In: Proceedings of the 28th ACM International Conference on Multimedia. pp. 2909–2917.
Ahmed, Sinha, Phadikar, Ghaderpour (b1) 2022; 22
Liu, Fu (b35) 2021; 119
Zheng (b72) 2016; 9
Zhang, Cui, Zhong (b68) 2023; 11
Dosovitskiy, Beyer, Kolesnikov, Weissenborn, Zhai, Unterthiner, Dehghani, Minderer, Heigold, Gelly (b13) 2020
Zhong, Wang, Miao (b75) 2020; 13
Li, Yang, Zheng, Wenming, Cui, Zhen, Zhang, Tong, Zong, Yuan, 2018. A Novel Neural Network Model based on Cerebral Hemispheric Asymmetry for EEG Emotion Recognition. In: IJCAI. pp. 1561–1567.
Peng, Zhao, Zhang, Xu, Kong (b45) 2023; 154
Awan, Usman, Khalid, Anwar, Alroobaea, Hussain, Almotiri, Ullah, Akram (b3) 2022; 22
Ke, Ma, Li, Lv, Zou (b24) 2024; 14
Li, Chai, Wang, Yang, Du (b25) 2021; 13
Katsigiannis, Ramzan (b23) 2017; 22
Papa, Russo, Amerini, Zhou (b44) 2024
Liang, Jingyun, Cao, Jiezhang, Sun, Guolei, Zhang, Kai, Van Gool, Luc, Timofte, Radu, 2021. Swinir: Image restoration using swin transformer. In: Proceedings of the IEEE/CVF International Conference on Computer Vision. pp. 1833–1844.
Lundberg, Lee (b40) 2017; vol. 30
Temenos, Temenos, Kaselimi, Doulamis, Doulamis (b56) 2023; 20
Song, Zheng, Liu, Gao (b52) 2022; 31
Li, Wang, Zhang, Liu, Song, Cheng, Chen (b27) 2022; 143
Hartikainen (b20) 2021; 11
Li, Li, Pan, Wang (b26) 2021; 15
Topic, Russo (b57) 2021; 24
Chen, Li, Wan, Xu, Bezerianos, Wang (b6) 2022; 71
Guo, Zhang, Fan, Shen, Peng (b19) 2024; 12
Liu, Yang (b38) 2021; 11
Ma, Tang, Fan, Huang, Mei, Ma (b41) 2022; 9
Fan, Xie, Tao, Li, Pei, Li, Lv (b15) 2024; 87
Vaswani, Shazeer, Parmar, Uszkoreit, Jones, Gomez, Kaiser, Polosukhin (b58) 2017; vol. 30
Mishra, Bhusnur, Mishra, Singh (b43) 2024
Wang, Nie, Lu (b59) 2011
Wang, Wang, Hu, Yin, Song (b61) 2022; 22
Guo, Cai, An, Chen, Ma, Wan, Gao (b18) 2022; 603
Liu, Qiu, Zheng, Lu (b37) 2021; 14
Islam, Andreev, Shusharina, Hramov (b21) 2022; 10
Xing, Li, Xu, Shu, Hu, Xu (b64) 2019; 13
Bărbulescu, Saliba (b4) 2024; 15
Selvaraju, Ramprasaath R., Cogswell, Michael, Das, Abhishek, Vedantam, Ramakrishna, Parikh, Devi, Batra, Dhruv, 2017. Grad-cam: Visual explanations from deep networks via gradient-based localization. In: Proceedings of the IEEE International Conference on Computer Vision. pp. 618–626.
Arakaki, Arechavala, Choy, Bautista, Bliss, Molloy, Wu, Shimojo, Jiang, Kleinman (b2) 2023; 17
Du, Ma, Zhang, Li, Lai, Zhao, Deng, Liu, Wang (b14) 2020; 13
Wei, Liu, Li, Cheng, Song, Chen (b62) 2023; 152
Zhong, Gu, Luo, Zeng, Liu (b74) 2023; 53
Song, Zheng, Liu, Zong, Cui, Li (b53) 2021; 10
Yao, Li, Ding, Wang, Zhao, Gong, Nan, Fu (b67) 2024; 14
Ding, Robinson, Zhang, Zeng, Guan (b11) 2022; 14
Wang, Song, Tao, Liotta, Yang, Li, Gao, Sun, Ge, Zhang (b60) 2022; 83
Budnik-Przybylska, Syty, Kaźmierczak, Przybylski, Doliński, Łabuda, Jasik, Kastrau, Di Fronso, Bertollo (b5) 2024; 14
Li, Wang, Zheng, Zong, Qi, Cui, Zhang, Song (b28) 2020; 13
Winter (b63) 2002; vol. 3
Zhao, Zhang, Zhu, You, Kuang, Sun (b71) 2019; 27
Devlin, Chang, Lee, Toutanova (b10) 2018
Li, Zhang, Wang, Wei, Dang (b31) 2023; 20
Cheng, Chen, Li, Liu, Song, Liu, Chen (b7) 2020; 25
Zhang, Wei, Zou, Fu (b69) 2020; 96
Yang, Wu, Fu, Chen (b66) 2018
Li, Zheng, Wang, Zong, Cui (b33) 2019; 13
Zheng, Lu (b73) 2015; 7
Cimtay, Ekmekcioglu (b8) 2020; 20
Zhao, Xu, He, Peng (b70) 2023
Raffel, Shazeer, Roberts, Lee, Narang, Matena, Zhou, Li, Liu (b46) 2020; 21
Saliba, Bărbulescu (b49) 2024; 12
Shen, Li, Liang, Zhao, Ma, Wu, Zhang, Zhang, Hu (b51) 2024; 28
Feutrill, Roughan (b16) 2021; 23
Ding, Tong, Zhang, Jiang, Li, Liang, Guan (b12) 2024
Xu, Pan, Zheng, Ouyang, Jia, Zeng (b65) 2024; 243
Garg, Verma, Singh (b17) 2024; 154
Liu, Ze, Lin, Yutong, Cao, Yue, Hu, Han, Wei, Yixuan, Zhang, Zheng, Lin, Stephen, Guo, Baining, 2021a. Swin transformer: Hierarchical vision transformer using shifted windows. In: Proceedings of the IEEE/CVF International Conference on Computer Vision. pp. 10012–10022.
Song, Zheng, Song, Cui (b54) 2018; 11
Li, Zhang, Cao, Timofte, Van Gool (b30) 2021
Rahman, Sarkar, Hossain, Hossain, Islam, Hossain, Quinn, Moni (b47) 2021; 136
Ribeiro, Marco Tulio, Singh, Sameer, Guestrin, Carlos, 2016. ” Why should i trust you?” Explaining the predictions of any classifier. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. pp. 1135–1144.
Yang (10.1016/j.jneumeth.2024.110317_b66) 2018
Awan (10.1016/j.jneumeth.2024.110317_b3) 2022; 22
Ding (10.1016/j.jneumeth.2024.110317_b12) 2024
10.1016/j.jneumeth.2024.110317_b32
Zhang (10.1016/j.jneumeth.2024.110317_b68) 2023; 11
10.1016/j.jneumeth.2024.110317_b34
10.1016/j.jneumeth.2024.110317_b36
Saliba (10.1016/j.jneumeth.2024.110317_b49) 2024; 12
Chen (10.1016/j.jneumeth.2024.110317_b6) 2022; 71
Li (10.1016/j.jneumeth.2024.110317_b28) 2020; 13
Li (10.1016/j.jneumeth.2024.110317_b33) 2019; 13
Zhang (10.1016/j.jneumeth.2024.110317_b69) 2020; 96
Zheng (10.1016/j.jneumeth.2024.110317_b73) 2015; 7
Papa (10.1016/j.jneumeth.2024.110317_b44) 2024
Wang (10.1016/j.jneumeth.2024.110317_b61) 2022; 22
Ding (10.1016/j.jneumeth.2024.110317_b11) 2022; 14
Zheng (10.1016/j.jneumeth.2024.110317_b72) 2016; 9
Islam (10.1016/j.jneumeth.2024.110317_b21) 2022; 10
Song (10.1016/j.jneumeth.2024.110317_b53) 2021; 10
Ahmed (10.1016/j.jneumeth.2024.110317_b1) 2022; 22
Xu (10.1016/j.jneumeth.2024.110317_b65) 2024; 243
Devlin (10.1016/j.jneumeth.2024.110317_b10) 2018
10.1016/j.jneumeth.2024.110317_b22
Li (10.1016/j.jneumeth.2024.110317_b30) 2021
Liu (10.1016/j.jneumeth.2024.110317_b38) 2021; 11
Cimtay (10.1016/j.jneumeth.2024.110317_b8) 2020; 20
Song (10.1016/j.jneumeth.2024.110317_b52) 2022; 31
Xing (10.1016/j.jneumeth.2024.110317_b64) 2019; 13
Li (10.1016/j.jneumeth.2024.110317_b26) 2021; 15
Yao (10.1016/j.jneumeth.2024.110317_b67) 2024; 14
Winter (10.1016/j.jneumeth.2024.110317_b63) 2002; vol. 3
Zhao (10.1016/j.jneumeth.2024.110317_b71) 2019; 27
Guo (10.1016/j.jneumeth.2024.110317_b18) 2022; 603
Bărbulescu (10.1016/j.jneumeth.2024.110317_b4) 2024; 15
Zhong (10.1016/j.jneumeth.2024.110317_b74) 2023; 53
Garg (10.1016/j.jneumeth.2024.110317_b17) 2024; 154
Guo (10.1016/j.jneumeth.2024.110317_b19) 2024; 12
Temenos (10.1016/j.jneumeth.2024.110317_b56) 2023; 20
Cheng (10.1016/j.jneumeth.2024.110317_b7) 2020; 25
Du (10.1016/j.jneumeth.2024.110317_b14) 2020; 13
Li (10.1016/j.jneumeth.2024.110317_b31) 2023; 20
Wang (10.1016/j.jneumeth.2024.110317_b60) 2022; 83
Zhong (10.1016/j.jneumeth.2024.110317_b75) 2020; 13
Wang (10.1016/j.jneumeth.2024.110317_b59) 2011
Ma (10.1016/j.jneumeth.2024.110317_b41) 2022; 9
Feutrill (10.1016/j.jneumeth.2024.110317_b16) 2021; 23
Hartikainen (10.1016/j.jneumeth.2024.110317_b20) 2021; 11
Liu (10.1016/j.jneumeth.2024.110317_b39) 2023; 85
10.1016/j.jneumeth.2024.110317_b50
Mishra (10.1016/j.jneumeth.2024.110317_b43) 2024
Song (10.1016/j.jneumeth.2024.110317_b54) 2018; 11
Li (10.1016/j.jneumeth.2024.110317_b27) 2022; 143
Miao (10.1016/j.jneumeth.2024.110317_b42) 2023; 79
Topic (10.1016/j.jneumeth.2024.110317_b57) 2021; 24
Katsigiannis (10.1016/j.jneumeth.2024.110317_b23) 2017; 22
Budnik-Przybylska (10.1016/j.jneumeth.2024.110317_b5) 2024; 14
Ke (10.1016/j.jneumeth.2024.110317_b24) 2024; 14
Liu (10.1016/j.jneumeth.2024.110317_b35) 2021; 119
Confalonieri (10.1016/j.jneumeth.2024.110317_b9) 2021; 11
Li (10.1016/j.jneumeth.2024.110317_b29) 2020; 415
Rahman (10.1016/j.jneumeth.2024.110317_b47) 2021; 136
Arakaki (10.1016/j.jneumeth.2024.110317_b2) 2023; 17
Sun (10.1016/j.jneumeth.2024.110317_b55) 2022; 10
Shen (10.1016/j.jneumeth.2024.110317_b51) 2024; 28
Zhao (10.1016/j.jneumeth.2024.110317_b70) 2023
Fan (10.1016/j.jneumeth.2024.110317_b15) 2024; 87
Liu (10.1016/j.jneumeth.2024.110317_b37) 2021; 14
Raffel (10.1016/j.jneumeth.2024.110317_b46) 2020; 21
Li (10.1016/j.jneumeth.2024.110317_b25) 2021; 13
Lundberg (10.1016/j.jneumeth.2024.110317_b40) 2017; vol. 30
Peng (10.1016/j.jneumeth.2024.110317_b45) 2023; 154
10.1016/j.jneumeth.2024.110317_b48
Dosovitskiy (10.1016/j.jneumeth.2024.110317_b13) 2020
Vaswani (10.1016/j.jneumeth.2024.110317_b58) 2017; vol. 30
Wei (10.1016/j.jneumeth.2024.110317_b62) 2023; 152
References_xml – volume: 119
  start-page: 1
  year: 2021
  end-page: 6
  ident: b35
  article-title: Emotion recognition by deeply learned multi-channel textual and EEG features
  publication-title: Future Gener. Comput. Syst.
– volume: 22
  start-page: 98
  year: 2017
  end-page: 107
  ident: b23
  article-title: DREAMER: A database for emotion recognition through EEG and ECG signals from wireless low-cost off-the-shelf devices
  publication-title: IEEE J. Biomed. Health Inform.
– volume: 85
  year: 2023
  ident: b39
  article-title: GLFANet: A global to local feature aggregation network for EEG emotion recognition
  publication-title: Biomed. Signal Process. Control.
– volume: 603
  year: 2022
  ident: b18
  article-title: A transformer based neural network for emotion recognition and visualizations of crucial EEG channels
  publication-title: Phys. A
– volume: 25
  start-page: 453
  year: 2020
  end-page: 464
  ident: b7
  article-title: Emotion recognition from multi-channel EEG via deep forest
  publication-title: IEEE J. Biomed. Health Inf.
– year: 2020
  ident: b13
  article-title: An image is worth 16x16 words: Transformers for image recognition at scale
– volume: 21
  start-page: 5485
  year: 2020
  end-page: 5551
  ident: b46
  article-title: Exploring the limits of transfer learning with a unified text-to-text transformer
  publication-title: J. Mach. Learn. Res.
– volume: 13
  start-page: 568
  year: 2019
  end-page: 578
  ident: b33
  article-title: From regional to global brain: A novel hierarchical spatial-temporal neural network model for EEG emotion recognition
  publication-title: IEEE Trans. Affect. Comput.
– volume: 12
  start-page: 177
  year: 2024
  ident: b49
  article-title: Downscaling MERRA-2 reanalysis PM2. 5 series over the Arabian Gulf by inverse distance weighting, bicubic spline smoothing, and spatio-temporal kriging
  publication-title: Toxics
– start-page: 734
  year: 2011
  end-page: 743
  ident: b59
  article-title: EEG-based emotion recognition using frequency domain features and support vector machines
  publication-title: Neural Information Processing: 18th International Conference, ICONIP 2011, Shanghai, China, November 13-17, 2011, Proceedings, Part I 18
– volume: 154
  year: 2024
  ident: b17
  article-title: EEG-based emotion recognition using MobileNet recurrent neural network with time-frequency features
  publication-title: Appl. Soft Comput.
– reference: Selvaraju, Ramprasaath R., Cogswell, Michael, Das, Abhishek, Vedantam, Ramakrishna, Parikh, Devi, Batra, Dhruv, 2017. Grad-cam: Visual explanations from deep networks via gradient-based localization. In: Proceedings of the IEEE International Conference on Computer Vision. pp. 618–626.
– volume: 152
  year: 2023
  ident: b62
  article-title: TC-Net: A transformer capsule network for EEG-based emotion recognition
  publication-title: Comput. Biol. Med.
– volume: 143
  year: 2022
  ident: b27
  article-title: Emotion recognition from EEG based on multi-task learning with capsule network and attention mechanism
  publication-title: Comput. Biol. Med.
– volume: 10
  start-page: 1399
  year: 2021
  end-page: 1413
  ident: b53
  article-title: Graph-embedded convolutional neural network for image-based EEG emotion recognition
  publication-title: IEEE Trans. Emerg. Top. Comput.
– volume: 53
  start-page: 15278
  year: 2023
  end-page: 15294
  ident: b74
  article-title: Bi-hemisphere asymmetric attention network: Recognizing emotion from EEG signals based on the transformer
  publication-title: Appl. Intell.
– volume: 9
  start-page: 281
  year: 2016
  end-page: 290
  ident: b72
  article-title: Multichannel EEG-based emotion recognition via group sparse canonical correlation analysis
  publication-title: IEEE Trans. Cogn. Dev. Syst.
– reference: Li, Yang, Zheng, Wenming, Cui, Zhen, Zhang, Tong, Zong, Yuan, 2018. A Novel Neural Network Model based on Cerebral Hemispheric Asymmetry for EEG Emotion Recognition. In: IJCAI. pp. 1561–1567.
– volume: 154
  year: 2023
  ident: b45
  article-title: Temporal relative transformer encoding cooperating with channel attention for EEG emotion analysis
  publication-title: Comput. Biol. Med.
– volume: 20
  year: 2023
  ident: b31
  article-title: Emotion recognition using spatial-temporal EEG features through convolutional graph attention network
  publication-title: J. Neural Eng.
– volume: 22
  start-page: 9480
  year: 2022
  ident: b3
  article-title: An ensemble learning method for emotion charting using multimodal physiological signals
  publication-title: Sensors
– volume: 136
  year: 2021
  ident: b47
  article-title: Recognition of human emotions using EEG signals: A review
  publication-title: Comput. Biol. Med.
– volume: 14
  start-page: 2238
  year: 2022
  end-page: 2250
  ident: b11
  article-title: Tsception: Capturing temporal dynamics and spatial asymmetry from EEG for emotion recognition
  publication-title: IEEE Trans. Affect. Comput.
– volume: 14
  start-page: 702
  year: 2024
  ident: b24
  article-title: Multi-region and multi-band electroencephalogram emotion recognition based on self-attention and capsule network
  publication-title: Appl. Sci.
– volume: 31
  start-page: 710
  year: 2022
  end-page: 719
  ident: b52
  article-title: EEG conformer: Convolutional transformer for EEG decoding and visualization
  publication-title: IEEE Trans. Neural Syst. Rehabil. Eng.
– start-page: 433
  year: 2018
  end-page: 443
  ident: b66
  article-title: Continuous convolutional neural network with 3D input for EEG-based emotion recognition
  publication-title: Neural Information Processing: 25th International Conference, ICONIP 2018, Siem Reap, Cambodia, December 13–16, 2018, Proceedings, Part VII 25
– reference: Liang, Jingyun, Cao, Jiezhang, Sun, Guolei, Zhang, Kai, Van Gool, Luc, Timofte, Radu, 2021. SwinIR: Image restoration using Swin Transformer. In: Proceedings of the IEEE/CVF International Conference on Computer Vision. pp. 1833–1844.
– volume: 10
  start-page: 3131
  year: 2022
  ident: b55
  article-title: Multi-channel EEG emotion recognition based on parallel transformer and 3D-convolutional neural network
  publication-title: Mathematics
– volume: 22
  start-page: 4359
  year: 2022
  end-page: 4368
  ident: b61
  article-title: Transformers for EEG-based emotion recognition: A hierarchical spatial information learning model
  publication-title: IEEE Sens. J.
– volume: 71
  start-page: 1
  year: 2022
  end-page: 15
  ident: b6
  article-title: Fusing frequency-domain features and brain connectivity features for cross-subject emotion recognition
  publication-title: IEEE Trans. Instrum. Meas.
– volume: vol. 30
  year: 2017
  ident: b40
  article-title: A unified approach to interpreting model predictions
  publication-title: Advances in Neural Information Processing Systems
– volume: vol. 3
  start-page: 2025
  year: 2002
  end-page: 2054
  ident: b63
  article-title: The Shapley value
– volume: 23
  start-page: 1046
  year: 2021
  ident: b16
  article-title: A review of Shannon and differential entropy rate estimation
  publication-title: Entropy
– volume: 13
  start-page: 1290
  year: 2020
  end-page: 1301
  ident: b75
  article-title: EEG-based emotion recognition using regularized graph neural networks
  publication-title: IEEE Trans. Affect. Comput.
– reference: Ribeiro, Marco Tulio, Singh, Sameer, Guestrin, Carlos, 2016. "Why should I trust you?": Explaining the predictions of any classifier. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. pp. 1135–1144.
– volume: vol. 30
  year: 2017
  ident: b58
  article-title: Attention is all you need
  publication-title: Advances in Neural Information Processing Systems
– year: 2023
  ident: b70
  article-title: Interpretable emotion classification using multi-domain feature of EEG signals
  publication-title: IEEE Sens. J.
– volume: 7
  start-page: 162
  year: 2015
  end-page: 175
  ident: b73
  article-title: Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks
  publication-title: IEEE Trans. Auton. Ment. Dev.
– year: 2021
  ident: b30
  article-title: LocalViT: Bringing locality to vision transformers
– volume: 11
  start-page: 10758
  year: 2021
  ident: b38
  article-title: A three-branch 3D convolutional neural network for EEG-based different hand movement stages classification
  publication-title: Sci. Rep.
– volume: 17
  year: 2023
  ident: b2
  article-title: The connection between heart rate variability (HRV), neurological health, and cognition: A literature review
  publication-title: Front. Neurosci.
– volume: 22
  start-page: 2346
  year: 2022
  ident: b1
  article-title: Automated feature extraction on AsMap for emotion classification using EEG
  publication-title: Sensors
– volume: 13
  start-page: 354
  year: 2020
  end-page: 367
  ident: b28
  article-title: A novel bi-hemispheric discrepancy model for EEG emotion recognition
  publication-title: IEEE Trans. Cogn. Dev. Syst.
– volume: 87
  year: 2024
  ident: b15
  article-title: ICaps-ResLSTM: Improved capsule network and residual LSTM for EEG emotion recognition
  publication-title: Biomed. Signal Process. Control
– volume: 79
  year: 2023
  ident: b42
  article-title: A multiple frequency bands parallel spatial–temporal 3D deep residual learning framework for EEG-based emotion recognition
  publication-title: Biomed. Signal Process. Control
– volume: 15
  year: 2021
  ident: b26
  article-title: Cross-subject EEG emotion recognition with self-organized graph neural network
  publication-title: Front. Neurosci.
– volume: 14
  start-page: 715
  year: 2021
  end-page: 729
  ident: b37
  article-title: Comparing recognition performance and robustness of multimodal deep learning models for multimodal emotion recognition
  publication-title: IEEE Trans. Cogn. Dev. Syst.
– volume: 11
  start-page: 1424
  year: 2023
  ident: b68
  article-title: EEG-based emotion recognition via knowledge-integrated interpretable method
  publication-title: Mathematics
– year: 2018
  ident: b10
  article-title: BERT: Pre-training of deep bidirectional transformers for language understanding
– reference: Liu, Ze, Lin, Yutong, Cao, Yue, Hu, Han, Wei, Yixuan, Zhang, Zheng, Lin, Stephen, Guo, Baining, 2021a. Swin transformer: Hierarchical vision transformer using shifted windows. In: Proceedings of the IEEE/CVF International Conference on Computer Vision. pp. 10012–10022.
– volume: 243
  year: 2024
  ident: b65
  article-title: EESCN: A novel spiking neural network method for EEG-based emotion recognition
  publication-title: Comput. Methods Programs Biomed.
– volume: 10
  start-page: 2819
  year: 2022
  ident: b21
  article-title: Explainable machine learning methods for classification of brain states during visual perception
  publication-title: Mathematics
– volume: 415
  start-page: 225
  year: 2020
  end-page: 233
  ident: b29
  article-title: EEG-based intention recognition with deep recurrent-convolution neural network: Performance and channel selection by grad-CAM
  publication-title: Neurocomputing
– volume: 27
  start-page: 2164
  year: 2019
  end-page: 2177
  ident: b71
  article-title: A multi-branch 3D convolutional neural network for EEG-based motor imagery classification
  publication-title: IEEE Trans. Neural Syst. Rehabil. Eng.
– year: 2024
  ident: b43
  article-title: Comparative analysis of parametric B-spline and Hermite cubic spline based methods for accurate ECG signal modeling
  publication-title: J. Electrocardiol.
– volume: 14
  start-page: 5197
  year: 2024
  ident: b5
  article-title: Psychophysiological strategies for enhancing performance through imagery–skin conductance level analysis in guided vs. self-produced imagery
  publication-title: Sci. Rep.
– volume: 24
  start-page: 1442
  year: 2021
  end-page: 1454
  ident: b57
  article-title: Emotion recognition based on EEG feature maps through deep learning network
  publication-title: Eng. Sci. Technol., Int. J.
– volume: 14
  year: 2024
  ident: b67
  article-title: Emotion classification based on transformer and CNN for EEG spatial–temporal feature learning
  publication-title: Brain Sci.
– volume: 28
  start-page: 5247
  year: 2024
  end-page: 5259
  ident: b51
  article-title: HEMAsNet: A hemisphere asymmetry network inspired by the brain for depression recognition from electroencephalogram signals
  publication-title: IEEE J. Biomed. Health Inf.
– volume: 11
  year: 2021
  ident: b9
  article-title: A historical perspective of explainable artificial intelligence
  publication-title: Wiley Interdiscipl. Rev.: Data Min. Knowl. Discov.
– volume: 11
  start-page: 532
  year: 2018
  end-page: 541
  ident: b54
  article-title: EEG emotion recognition using dynamical graph convolutional neural networks
  publication-title: IEEE Trans. Affect. Comput.
– year: 2024
  ident: b12
  article-title: EmT: A novel transformer for generalized cross-subject EEG emotion recognition
– volume: 13
  start-page: 1528
  year: 2020
  end-page: 1540
  ident: b14
  article-title: An efficient LSTM network for emotion recognition from multichannel EEG signals
  publication-title: IEEE Trans. Affect. Comput.
– volume: 12
  year: 2024
  ident: b19
  article-title: A comprehensive interaction in multiscale multichannel EEG signals for emotion recognition
  publication-title: Mathematics
– volume: 11
  start-page: 1006
  year: 2021
  ident: b20
  article-title: Emotion-attention interaction in the right hemisphere
  publication-title: Brain Sci.
– volume: 20
  start-page: 1
  year: 2023
  end-page: 5
  ident: b56
  article-title: Interpretable deep learning framework for land use and land cover classification in remote sensing using SHAP
  publication-title: IEEE Geosci. Remote Sens. Lett.
– reference: Jia, Ziyu, Lin, Youfang, Cai, Xiyang, Chen, Haobin, Gou, Haijun, Wang, Jing, 2020. SST-EmotionNet: Spatial-spectral-temporal based attention 3D dense network for EEG emotion recognition. In: Proceedings of the 28th ACM International Conference on Multimedia. pp. 2909–2917.
– volume: 15
  start-page: 748
  year: 2024
  ident: b4
  article-title: Sensitivity analysis of the inverse distance weighting and bicubic spline smoothing models for MERRA-2 reanalysis PM2.5 series in the Persian Gulf region
  publication-title: Atmosphere
– volume: 9
  start-page: 1200
  year: 2022
  end-page: 1217
  ident: b41
  article-title: SwinFusion: Cross-domain long-range learning for general image fusion via Swin Transformer
  publication-title: IEEE/CAA J. Autom. Sin.
– volume: 83
  start-page: 19
  year: 2022
  end-page: 52
  ident: b60
  article-title: A systematic review on affective computing: Emotion models, databases, and recent advances
  publication-title: Inf. Fusion
– volume: 96
  year: 2020
  ident: b69
  article-title: Automatic epileptic EEG classification based on differential entropy and attention model
  publication-title: Eng. Appl. Artif. Intell.
– volume: 20
  start-page: 2034
  year: 2020
  ident: b8
  article-title: Investigating the use of pretrained convolutional neural network on cross-subject and cross-dataset EEG emotion recognition
  publication-title: Sensors
– start-page: 1
  year: 2024
  end-page: 20
  ident: b44
  article-title: A survey on efficient vision transformers: Algorithms, techniques, and performance benchmarking
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
– volume: 13
  start-page: 37
  year: 2019
  ident: b64
  article-title: SAE+LSTM: A new framework for emotion recognition from multi-channel EEG
  publication-title: Front. Neurorobot.
– volume: 13
  start-page: 885
  year: 2021
  end-page: 897
  ident: b25
  article-title: EEG emotion recognition based on 3-D feature representation and dilated fully convolutional networks
  publication-title: IEEE Trans. Cogn. Dev. Syst.
– year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b70
  article-title: Interpretable emotion classification using multi-domain feature of EEG signals
  publication-title: IEEE Sens. J.
– volume: 11
  start-page: 1006
  issue: 8
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b20
  article-title: Emotion-attention interaction in the right hemisphere
  publication-title: Brain Sci.
  doi: 10.3390/brainsci11081006
– volume: 22
  start-page: 4359
  issue: 5
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b61
  article-title: Transformers for EEG-based emotion recognition: A hierarchical spatial information learning model
  publication-title: IEEE Sens. J.
  doi: 10.1109/JSEN.2022.3144317
– volume: 12
  issue: 8
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b19
  article-title: A comprehensive interaction in multiscale multichannel EEG signals for emotion recognition
  publication-title: Mathematics
  doi: 10.3390/math12081180
– volume: 22
  start-page: 9480
  issue: 23
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b3
  article-title: An ensemble learning method for emotion charting using multimodal physiological signals
  publication-title: Sensors
  doi: 10.3390/s22239480
– volume: 9
  start-page: 1200
  issue: 7
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b41
  article-title: SwinFusion: Cross-domain long-range learning for general image fusion via Swin Transformer
  publication-title: IEEE/CAA J. Autom. Sin.
  doi: 10.1109/JAS.2022.105686
– volume: 10
  start-page: 1399
  issue: 3
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b53
  article-title: Graph-embedded convolutional neural network for image-based EEG emotion recognition
  publication-title: IEEE Trans. Emerg. Top. Comput.
  doi: 10.1109/TETC.2021.3087174
– volume: 11
  start-page: 1424
  issue: 6
  year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b68
  article-title: EEG-based emotion recognition via knowledge-integrated interpretable method
  publication-title: Mathematics
  doi: 10.3390/math11061424
– volume: 603
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b18
  article-title: A transformer based neural network for emotion recognition and visualizations of crucial EEG channels
  publication-title: Phys. A
  doi: 10.1016/j.physa.2022.127700
– ident: 10.1016/j.jneumeth.2024.110317_b32
  doi: 10.24963/ijcai.2018/216
– volume: 9
  start-page: 281
  issue: 3
  year: 2016
  ident: 10.1016/j.jneumeth.2024.110317_b72
  article-title: Multichannel EEG-based emotion recognition via group sparse canonical correlation analysis
  publication-title: IEEE Trans. Cogn. Dev. Syst.
  doi: 10.1109/TCDS.2016.2587290
– volume: 96
  year: 2020
  ident: 10.1016/j.jneumeth.2024.110317_b69
  article-title: Automatic epileptic EEG classification based on differential entropy and attention model
  publication-title: Eng. Appl. Artif. Intell.
  doi: 10.1016/j.engappai.2020.103975
– volume: 20
  start-page: 2034
  issue: 7
  year: 2020
  ident: 10.1016/j.jneumeth.2024.110317_b8
  article-title: Investigating the use of pretrained convolutional neural network on cross-subject and cross-dataset EEG emotion recognition
  publication-title: Sensors
  doi: 10.3390/s20072034
– volume: 10
  start-page: 2819
  issue: 15
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b21
  article-title: Explainable machine learning methods for classification of brain states during visual perception
  publication-title: Mathematics
  doi: 10.3390/math10152819
– volume: vol. 30
  year: 2017
  ident: 10.1016/j.jneumeth.2024.110317_b58
  article-title: Attention is all you need
– volume: 13
  start-page: 885
  issue: 4
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b25
  article-title: EEG emotion recognition based on 3-D feature representation and dilated fully convolutional networks
  publication-title: IEEE Trans. Cogn. Dev. Syst.
  doi: 10.1109/TCDS.2021.3051465
– start-page: 1
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b44
  article-title: A survey on efficient vision transformers: Algorithms, techniques, and performance benchmarking
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
– volume: 83
  start-page: 19
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b60
  article-title: A systematic review on affective computing: Emotion models, databases, and recent advances
  publication-title: Inf. Fusion
  doi: 10.1016/j.inffus.2022.03.009
– ident: 10.1016/j.jneumeth.2024.110317_b36
  doi: 10.1109/ICCV48922.2021.00986
– volume: 14
  start-page: 702
  issue: 2
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b24
  article-title: Multi-region and multi-band electroencephalogram emotion recognition based on self-attention and capsule network
  publication-title: Appl. Sci.
  doi: 10.3390/app14020702
– volume: 20
  start-page: 1
  year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b56
  article-title: Interpretable deep learning framework for land use and land cover classification in remote sensing using SHAP
  publication-title: IEEE Geosci. Remote Sens. Lett.
  doi: 10.1109/LGRS.2023.3251652
– volume: 20
  issue: 1
  year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b31
  article-title: Emotion recognition using spatial-temporal EEG features through convolutional graph attention network
  publication-title: J. Neural Eng.
  doi: 10.1088/1741-2552/acb79e
– volume: 11
  start-page: 10758
  issue: 1
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b38
  article-title: A three-branch 3D convolutional neural network for EEG-based different hand movement stages classification
  publication-title: Sci. Rep.
  doi: 10.1038/s41598-021-89414-x
– volume: 22
  start-page: 2346
  issue: 6
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b1
  article-title: Automated feature extraction on AsMap for emotion classification using EEG
  publication-title: Sensors
  doi: 10.3390/s22062346
– volume: 154
  year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b45
  article-title: Temporal relative transformer encoding cooperating with channel attention for EEG emotion analysis
  publication-title: Comput. Biol. Med.
  doi: 10.1016/j.compbiomed.2023.106537
– volume: vol. 30
  year: 2017
  ident: 10.1016/j.jneumeth.2024.110317_b40
  article-title: A unified approach to interpreting model predictions
– volume: 21
  start-page: 5485
  issue: 1
  year: 2020
  ident: 10.1016/j.jneumeth.2024.110317_b46
  article-title: Exploring the limits of transfer learning with a unified text-to-text transformer
  publication-title: J. Mach. Learn. Res.
– volume: 136
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b47
  article-title: Recognition of human emotions using EEG signals: A review
  publication-title: Comput. Biol. Med.
  doi: 10.1016/j.compbiomed.2021.104696
– year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b43
  article-title: Comparative analysis of parametric B-spline and Hermite cubic spline based methods for accurate ECG signal modeling
  publication-title: J. Electrocardiol.
  doi: 10.1016/j.jelectrocard.2024.153783
– volume: 13
  start-page: 354
  issue: 2
  year: 2020
  ident: 10.1016/j.jneumeth.2024.110317_b28
  article-title: A novel bi-hemispheric discrepancy model for EEG emotion recognition
  publication-title: IEEE Trans. Cogn. Dev. Syst.
  doi: 10.1109/TCDS.2020.2999337
– volume: 31
  start-page: 710
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b52
  article-title: EEG conformer: Convolutional transformer for EEG decoding and visualization
  publication-title: IEEE Trans. Neural Syst. Rehabil. Eng.
  doi: 10.1109/TNSRE.2022.3230250
– ident: 10.1016/j.jneumeth.2024.110317_b48
  doi: 10.1145/2939672.2939778
– volume: 119
  start-page: 1
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b35
  article-title: Emotion recognition by deeply learned multi-channel textual and EEG features
  publication-title: Future Gener. Comput. Syst.
  doi: 10.1016/j.future.2021.01.010
– volume: 27
  start-page: 2164
  issue: 10
  year: 2019
  ident: 10.1016/j.jneumeth.2024.110317_b71
  article-title: A multi-branch 3D convolutional neural network for EEG-based motor imagery classification
  publication-title: IEEE Trans. Neural Syst. Rehabil. Eng.
  doi: 10.1109/TNSRE.2019.2938295
– ident: 10.1016/j.jneumeth.2024.110317_b34
  doi: 10.1109/ICCVW54120.2021.00210
– ident: 10.1016/j.jneumeth.2024.110317_b50
  doi: 10.1109/ICCV.2017.74
– volume: 143
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b27
  article-title: Emotion recognition from EEG based on multi-task learning with capsule network and attention mechanism
  publication-title: Comput. Biol. Med.
  doi: 10.1016/j.compbiomed.2022.105303
– volume: 11
  start-page: 532
  issue: 3
  year: 2018
  ident: 10.1016/j.jneumeth.2024.110317_b54
  article-title: EEG emotion recognition using dynamical graph convolutional neural networks
  publication-title: IEEE Trans. Affect. Comput.
  doi: 10.1109/TAFFC.2018.2817622
– volume: 22
  start-page: 98
  issue: 1
  year: 2017
  ident: 10.1016/j.jneumeth.2024.110317_b23
  article-title: DREAMER: A database for emotion recognition through EEG and ECG signals from wireless low-cost off-the-shelf devices
  publication-title: IEEE J. Biomed. Health Inform.
  doi: 10.1109/JBHI.2017.2688239
– year: 2018
  ident: 10.1016/j.jneumeth.2024.110317_b10
– volume: 415
  start-page: 225
  year: 2020
  ident: 10.1016/j.jneumeth.2024.110317_b29
  article-title: EEG-based intention recognition with deep recurrent-convolution neural network: Performance and channel selection by grad-CAM
  publication-title: Neurocomputing
  doi: 10.1016/j.neucom.2020.07.072
– volume: 25
  start-page: 453
  issue: 2
  year: 2020
  ident: 10.1016/j.jneumeth.2024.110317_b7
  article-title: Emotion recognition from multi-channel EEG via deep forest
  publication-title: IEEE J. Biomed. Health Inf.
  doi: 10.1109/JBHI.2020.2995767
– volume: 71
  start-page: 1
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b6
  article-title: Fusing frequency-domain features and brain connectivity features for cross-subject emotion recognition
  publication-title: IEEE Trans. Instrum. Meas.
– year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b30
– volume: 7
  start-page: 162
  issue: 3
  year: 2015
  ident: 10.1016/j.jneumeth.2024.110317_b73
  article-title: Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks
  publication-title: IEEE Trans. Auton. Ment. Dev.
  doi: 10.1109/TAMD.2015.2431497
– volume: 13
  start-page: 568
  issue: 2
  year: 2019
  ident: 10.1016/j.jneumeth.2024.110317_b33
  article-title: From regional to global brain: A novel hierarchical spatial-temporal neural network model for EEG emotion recognition
  publication-title: IEEE Trans. Affect. Comput.
  doi: 10.1109/TAFFC.2019.2922912
– year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b12
– ident: 10.1016/j.jneumeth.2024.110317_b22
  doi: 10.1145/3394171.3413724
– volume: 15
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b26
  article-title: Cross-subject EEG emotion recognition with self-organized graph neural network
  publication-title: Front. Neurosci.
– volume: 87
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b15
  article-title: ICaps-ResLSTM: Improved capsule network and residual LSTM for EEG emotion recognition
  publication-title: Biomed. Signal Process. Control
  doi: 10.1016/j.bspc.2023.105422
– start-page: 433
  year: 2018
  ident: 10.1016/j.jneumeth.2024.110317_b66
  article-title: Continuous convolutional neural network with 3D input for EEG-based emotion recognition
– volume: 154
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b17
  article-title: EEG-based emotion recognition using MobileNet recurrent neural network with time-frequency features
  publication-title: Appl. Soft Comput.
  doi: 10.1016/j.asoc.2024.111338
– volume: 13
  start-page: 1290
  issue: 3
  year: 2020
  ident: 10.1016/j.jneumeth.2024.110317_b75
  article-title: EEG-based emotion recognition using regularized graph neural networks
  publication-title: IEEE Trans. Affect. Comput.
  doi: 10.1109/TAFFC.2020.2994159
– volume: 17
  year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b2
  article-title: The connection between heart rate variability (HRV), neurological health, and cognition: A literature review
  publication-title: Front. Neurosci.
  doi: 10.3389/fnins.2023.1055445
– volume: 13
  start-page: 37
  year: 2019
  ident: 10.1016/j.jneumeth.2024.110317_b64
  article-title: SAE+LSTM: A new framework for emotion recognition from multi-channel EEG
  publication-title: Front. Neurorobot.
  doi: 10.3389/fnbot.2019.00037
– volume: 24
  start-page: 1442
  issue: 6
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b57
  article-title: Emotion recognition based on EEG feature maps through deep learning network
  publication-title: Eng. Sci. Technol., Int. J.
– volume: vol. 3
  start-page: 2025
  year: 2002
  ident: 10.1016/j.jneumeth.2024.110317_b63
  article-title: The Shapley value
– volume: 10
  start-page: 3131
  issue: 17
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b55
  article-title: Multi-channel EEG emotion recognition based on parallel transformer and 3D-convolutional neural network
  publication-title: Mathematics
  doi: 10.3390/math10173131
– volume: 23
  start-page: 1046
  issue: 8
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b16
  article-title: A review of Shannon and differential entropy rate estimation
  publication-title: Entropy
  doi: 10.3390/e23081046
– volume: 15
  start-page: 748
  issue: 7
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b4
  article-title: Sensitivity analysis of the inverse distance weighting and bicubic spline smoothing models for MERRA-2 reanalysis PM2.5 series in the Persian Gulf region
  publication-title: Atmosphere
  doi: 10.3390/atmos15070748
– volume: 85
  year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b39
  article-title: GLFANet: A global to local feature aggregation network for EEG emotion recognition
  publication-title: Biomed. Signal Process. Control.
  doi: 10.1016/j.bspc.2023.104799
– volume: 14
  start-page: 2238
  issue: 3
  year: 2022
  ident: 10.1016/j.jneumeth.2024.110317_b11
  article-title: TSception: Capturing temporal dynamics and spatial asymmetry from EEG for emotion recognition
  publication-title: IEEE Trans. Affect. Comput.
  doi: 10.1109/TAFFC.2022.3169001
– year: 2020
  ident: 10.1016/j.jneumeth.2024.110317_b13
– volume: 11
  issue: 1
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b9
  article-title: A historical perspective of explainable artificial intelligence
  publication-title: Wiley Interdiscipl. Rev.: Data Min. Knowl. Discov.
– volume: 13
  start-page: 1528
  issue: 3
  year: 2020
  ident: 10.1016/j.jneumeth.2024.110317_b14
  article-title: An efficient LSTM network for emotion recognition from multichannel EEG signals
  publication-title: IEEE Trans. Affect. Comput.
  doi: 10.1109/TAFFC.2020.3013711
– volume: 12
  start-page: 177
  issue: 3
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b49
  article-title: Downscaling MERRA-2 reanalysis PM2.5 series over the Arabian Gulf by inverse distance weighting, bicubic spline smoothing, and spatio-temporal kriging
  publication-title: Toxics
  doi: 10.3390/toxics12030177
– volume: 243
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b65
  article-title: EESCN: A novel spiking neural network method for EEG-based emotion recognition
  publication-title: Comput. Methods Programs Biomed.
  doi: 10.1016/j.cmpb.2023.107927
– volume: 53
  start-page: 15278
  issue: 12
  year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b74
  article-title: Bi-hemisphere asymmetric attention network: Recognizing emotion from EEG signals based on the transformer
  publication-title: Appl. Intell.
  doi: 10.1007/s10489-022-04228-2
– volume: 14
  start-page: 715
  issue: 2
  year: 2021
  ident: 10.1016/j.jneumeth.2024.110317_b37
  article-title: Comparing recognition performance and robustness of multimodal deep learning models for multimodal emotion recognition
  publication-title: IEEE Trans. Cogn. Dev. Syst.
  doi: 10.1109/TCDS.2021.3071170
– volume: 79
  year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b42
  article-title: A multiple frequency bands parallel spatial–temporal 3D deep residual learning framework for EEG-based emotion recognition
  publication-title: Biomed. Signal Process. Control
  doi: 10.1016/j.bspc.2022.104141
– volume: 28
  start-page: 5247
  issue: 9
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b51
  article-title: HEMAsNet: A hemisphere asymmetry network inspired by the brain for depression recognition from electroencephalogram signals
  publication-title: IEEE J. Biomed. Health Inf.
  doi: 10.1109/JBHI.2024.3404664
– volume: 152
  year: 2023
  ident: 10.1016/j.jneumeth.2024.110317_b62
  article-title: TC-Net: A transformer capsule network for EEG-based emotion recognition
  publication-title: Comput. Biol. Med.
  doi: 10.1016/j.compbiomed.2022.106463
– volume: 14
  issue: 3
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b67
  article-title: Emotion classification based on transformer and CNN for EEG spatial–temporal feature learning
  publication-title: Brain Sci.
  doi: 10.3390/brainsci14030268
– start-page: 734
  year: 2011
  ident: 10.1016/j.jneumeth.2024.110317_b59
  article-title: EEG-based emotion recognition using frequency domain features and support vector machines
– volume: 14
  start-page: 5197
  issue: 1
  year: 2024
  ident: 10.1016/j.jneumeth.2024.110317_b5
  article-title: Psychophysiological strategies for enhancing performance through imagery–skin conductance level analysis in guided vs. self-produced imagery
  publication-title: Sci. Rep.
  doi: 10.1038/s41598-024-55743-w
Snippet Emotion recognition using electroencephalogram (EEG) has become a research hotspot in the field of human–computer interaction, how to sufficiently learn...
StartPage 110317
SubjectTerms Adult
Attention - physiology
Brain - physiology
EEG
Electroencephalography - methods
Emotion recognition
Emotions - physiology
Explainability
Humans
Neural Networks, Computer
Self attention
Signal Processing, Computer-Assisted
Swin transformer
Title ST-SHAP: A hierarchical and explainable attention network for emotional EEG representation learning and decoding
URI https://dx.doi.org/10.1016/j.jneumeth.2024.110317
https://www.ncbi.nlm.nih.gov/pubmed/39542109
https://www.proquest.com/docview/3128826488
Volume 414