Improving subject transfer in EEG classification with divergence estimation

Bibliographic Details
Published in: Journal of neural engineering, Vol. 21, no. 6, pp. 66031–66049
Main Authors: Smedemark-Margulies, Niklas; Wang, Ye; Koike-Akino, Toshiaki; Liu, Jing; Parsons, Kieran; Bicer, Yunus; Erdoğmuş, Deniz
Format: Journal Article
Language: English
Published: England, IOP Publishing, 01.12.2024
Abstract
Objective. Classification models for electroencephalogram (EEG) data show a large decrease in performance when evaluated on unseen test subjects. We improve performance using new regularization techniques during model training. Approach. We propose several graphical models to describe an EEG classification task. From each model, we identify statistical relationships that should hold true in an idealized training scenario (with infinite data and a globally-optimal model) but that may not hold in practice. We design regularization penalties to enforce these relationships in two stages. First, we identify suitable proxy quantities (divergences such as Mutual Information and Wasserstein-1) that can be used to measure statistical independence and dependence relationships. Second, we provide algorithms to efficiently estimate these quantities during training using secondary neural network models. Main results. We conduct extensive computational experiments using a large benchmark EEG dataset, comparing our proposed techniques with a baseline method that uses an adversarial classifier. We first show the performance of each method across a wide range of hyperparameters, demonstrating that each method can be easily tuned to yield significant benefits over an unregularized model. We show that, using ideal hyperparameters for all methods, our first technique gives significantly better performance than the baseline regularization technique. We also show that, across hyperparameters, our second technique gives significantly more stable performance than the baseline. The proposed methods require only a small computational cost at training time, equivalent to the cost of the baseline. Significance. The high variability in signal distribution between subjects means that typical approaches to EEG signal modeling often require time-intensive calibration for each user, and even re-calibration before every use. By improving the performance of population models in the most stringent case of zero-shot subject transfer, we may help reduce or eliminate the need for model calibration.
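The abstract names Mutual Information and Wasserstein-1 as the proxy divergences; in the paper these are estimated during training by secondary neural networks, but both have simple exact forms in toy settings that show what those estimators target. The sketch below is illustrative only (the function names are ours, not the paper's): it computes the exact Wasserstein-1 distance between two equal-size 1-D empirical samples by sorting, and a plug-in mutual information estimate between discrete features and subject labels.

```python
import math
from collections import Counter

def wasserstein1_1d(xs, ys):
    # Exact Wasserstein-1 between two equal-size 1-D empirical samples:
    # sort both and average the coordinate-wise gaps.
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

def plugin_mutual_information(pairs):
    # Plug-in estimate (in nats) of I(feature; subject) from discrete
    # (feature, subject) pairs, using empirical joint and marginal counts.
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Identical feature distributions across subjects would give W1 near 0;
# a feature that perfectly reveals one of two balanced subjects gives
# MI of ln(2).
print(wasserstein1_1d([0.0, 1.0, 2.0], [1.0, 2.0, 3.0]))    # 1.0
print(plugin_mutual_information([(0, 0), (1, 1)] * 50))     # ~0.693 (= ln 2)
```

In the regularization setting described above, a subject-invariant feature encoder would drive both quantities toward zero; the secondary networks exist because neither has a closed form for continuous, high-dimensional features.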
Author_xml – sequence: 1
  givenname: Niklas
  orcidid: 0000-0002-4364-0273
  surname: Smedemark-Margulies
  fullname: Smedemark-Margulies, Niklas
  organization: Northeastern University, Khoury College of Computer Sciences, Boston, MA, United States of America
– sequence: 2
  givenname: Ye
  orcidid: 0000-0001-5220-1830
  surname: Wang
  fullname: Wang, Ye
  organization: Mitsubishi Electric Research Labs (MERL), Cambridge, MA, United States of America
– sequence: 3
  givenname: Toshiaki
  orcidid: 0000-0002-2578-5372
  surname: Koike-Akino
  fullname: Koike-Akino, Toshiaki
  organization: Mitsubishi Electric Research Labs (MERL), Cambridge, MA, United States of America
– sequence: 4
  givenname: Jing
  surname: Liu
  fullname: Liu, Jing
  organization: Mitsubishi Electric Research Labs (MERL), Cambridge, MA, United States of America
– sequence: 5
  givenname: Kieran
  surname: Parsons
  fullname: Parsons, Kieran
  organization: Mitsubishi Electric Research Labs (MERL), Cambridge, MA, United States of America
– sequence: 6
  givenname: Yunus
  surname: Bicer
  fullname: Bicer, Yunus
  organization: Northeastern University, Department of Electrical and Computer Engineering, Boston, MA, United States of America
– sequence: 7
  givenname: Deniz
  surname: Erdoğmuş
  fullname: Erdoğmuş, Deniz
  organization: Northeastern University, Department of Electrical and Computer Engineering, Boston, MA, United States of America
BackLink https://www.ncbi.nlm.nih.gov/pubmed/39591745 (View this record in MEDLINE/PubMed)
CODEN JNEOBH
ContentType Journal Article
Copyright 2024 IOP Publishing Ltd. All rights, including for text and data mining, AI training, and similar technologies, are reserved.
DOI 10.1088/1741-2552/ad9777
Discipline Anatomy & Physiology
EISSN 1741-2552
ExternalDocumentID 39591745
10_1088_1741_2552_ad9777
jnead9777
Genre Journal Article
GrantInformation_xml – fundername: Mitsubishi Electric Research Laboratories
  sequence: 0
  funderid: http://dx.doi.org/10.13039/100014462
ISSN 1741-2560
1741-2552
IsPeerReviewed true
IsScholarly true
Issue 6
Keywords brain–computer interface (BCI)
electroencephalography (EEG)
subject transfer learning
domain adaptation
representation learning
Language English
License This article is available under the terms of the IOP-Standard License.
Notes JNE-106871.R3
ORCID 0000-0002-4364-0273
0000-0002-2578-5372
0000-0001-5220-1830
PMID 39591745
PQID 3133418547
PQPubID 23479
PageCount 19
PublicationCentury 2000
PublicationDate 2024-12-01
PublicationDecade 2020
PublicationPlace England
PublicationTitle Journal of neural engineering
PublicationTitleAbbrev JNE
PublicationTitleAlternate J. Neural Eng
PublicationYear 2024
Publisher IOP Publishing
StartPage 66031
SubjectTerms Algorithms
Brain-Computer Interfaces
brain–computer interface (BCI)
domain adaptation
electroencephalography (EEG)
Electroencephalography - classification
Electroencephalography - methods
Humans
Neural Networks, Computer
representation learning
subject transfer learning
Title Improving subject transfer in EEG classification with divergence estimation
URI https://iopscience.iop.org/article/10.1088/1741-2552/ad9777
https://www.ncbi.nlm.nih.gov/pubmed/39591745
https://www.proquest.com/docview/3133418547
Volume 21