Siamese Sleep Transformer for Robust Sleep Stage Scoring with Self-Knowledge Distillation and Selective Batch Sampling


Bibliographic Details
Published in: 2023 11th International Winter Conference on Brain-Computer Interface (BCI), pp. 1-5
Main Authors: Kwak, Heon-Gyu; Kweon, Young-Seok; Shin, Gi-Hwan
Format: Conference Proceeding
Language: English
Published: IEEE, 20.02.2023

More Information
Summary: In this paper, we propose a Siamese sleep transformer (SST) that effectively extracts features from single-channel raw electroencephalogram signals for robust sleep stage scoring. Despite significant advances in sleep stage scoring in recent years, most previous approaches have focused mainly on incremental improvements in model performance. However, two problems remain: label bias in datasets and instability of model performance across repeated training runs. To alleviate these problems, we propose the SST, a novel sleep stage scoring model with a selective batch sampling strategy and self-knowledge distillation. To evaluate the model's robustness to label bias, we used different datasets for training and testing: the Sleep Heart Health Study and the Sleep-EDF datasets. Under this condition, the SST showed competitive performance in sleep stage scoring. In addition, we demonstrated the effectiveness of the selective batch sampling strategy, which reduced the standard deviation of performance across repeated training runs. These results suggest that the SST extracts features that are robust to label bias across datasets, and that the selective batch sampling strategy improves the stability of the model during training.
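The abstract does not detail either mechanism, but both follow well-known patterns. Below is a minimal sketch of a self-knowledge distillation loss in PyTorch, assuming the common formulation in which the network's own temperature-softened predictions (e.g., from a frozen copy of an earlier checkpoint) serve as soft targets; the temperature T and mixing weight alpha are illustrative placeholders, not values from the paper.

    # Sketch only: the paper's exact self-distillation loss is not given
    # in this abstract; T and alpha below are hypothetical placeholders.
    import torch
    import torch.nn.functional as F

    def self_distillation_loss(student_logits, teacher_logits, targets,
                               T=2.0, alpha=0.5):
        """Supervised cross-entropy plus KL divergence against the model's
        own temperature-softened earlier predictions (the 'teacher')."""
        hard_loss = F.cross_entropy(student_logits, targets)
        soft_loss = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits.detach() / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)  # T^2 scaling keeps gradient magnitudes comparable
        return alpha * hard_loss + (1.0 - alpha) * soft_loss

Likewise, a selective batch sampling strategy aimed at label bias can be sketched as class-balanced sampling over sleep stages, so that majority stages (e.g., N2) do not dominate each batch; the paper's actual selection rule is not specified in this abstract, and the function below is a generic illustration.

    # Sketch only: a generic class-balanced batch sampler, assumed (not
    # confirmed by the abstract) to approximate the selective strategy.
    import random
    from collections import defaultdict

    def balanced_batches(labels, batch_size, num_batches):
        """Yield index batches with roughly equal counts per sleep stage."""
        by_class = defaultdict(list)
        for idx, y in enumerate(labels):
            by_class[y].append(idx)
        classes = list(by_class)
        per_class = max(1, batch_size // len(classes))
        for _ in range(num_batches):
            batch = []
            for c in classes:
                # Sample with replacement so rare stages are not exhausted.
                batch.extend(random.choices(by_class[c], k=per_class))
            yield batch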
ISSN: 2572-7672
DOI: 10.1109/BCI57258.2023.10078532