Dual stream neural networks for brain signal classification


Bibliographic Details
Published in: Journal of Neural Engineering, Vol. 18, No. 1, p. 16006
Main Authors: Kuang, Dongyang; Michoski, Craig
Format: Journal Article
Language: English
Published: England, 01.02.2021

Summary: Objective. The primary objective of this work is to develop a neural network classifier for arbitrary collections of functional neuroimaging signals to be used in brain–computer interfaces (BCIs). Approach. We propose a dual stream neural network (DSNN) for the classification problem. The first stream is an end-to-end classifier that takes raw time-dependent signals as input and generates feature identification signatures from them. The second stream enhances the features identified by the first stream by adjoining a dynamic functional connectivity matrix aimed at incorporating nuanced multi-channel information during specified BCI tasks. Main results. The proposed DSNN classifier is benchmarked against three publicly available datasets, where it demonstrates performance comparable to, or better than, the state of the art in each instance. An information-theoretic examination of the trained network is also performed, using various tools, to demonstrate how to glean interpretive insight into how the hidden layers of the network parse the underlying biological signals. Significance. The resulting DSNN is a subject-independent classifier that works for any collection of 1D functional neuroimaging signals, with the option of integrating domain-specific information in the design.
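The record does not specify how the second stream's dynamic functional connectivity matrix is constructed; a common approach for multi-channel 1D signals is a sliding-window Pearson correlation between channels. A minimal sketch of that idea (function name, window parameters, and synthetic data are illustrative, not the paper's method):

```python
import numpy as np

def dynamic_connectivity(signals, window, step):
    """Sliding-window Pearson correlation between channels.

    signals: (n_channels, n_samples) array of 1D functional signals.
    Returns an array of shape (n_windows, n_channels, n_channels),
    one channel-by-channel correlation matrix per time window.
    """
    n_channels, n_samples = signals.shape
    mats = []
    for start in range(0, n_samples - window + 1, step):
        segment = signals[:, start:start + window]
        mats.append(np.corrcoef(segment))  # (n_channels, n_channels)
    return np.stack(mats)

# Synthetic example: 4 channels, 256 samples
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 256))
fc = dynamic_connectivity(x, window=64, step=32)
print(fc.shape)  # (7, 4, 4)
```

In a dual stream design, a stack of such matrices could be fed to the second stream alongside the raw signals consumed by the first, giving the classifier explicit cross-channel structure that an end-to-end stream may not easily recover on its own.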
ISSN: 1741-2560
1741-2552
DOI: 10.1088/1741-2552/abc903