From Word Embedding to Reading Embedding Using Large Language Model, EEG and Eye-tracking
Main Authors | , , , |
---|---|
Format | Journal Article |
Language | English |
Published | 28.01.2024 |
Subjects | |
Summary: | Reading comprehension, a fundamental cognitive ability essential for knowledge acquisition, is a complex skill in which a notable number of learners lack proficiency. This study introduces a new task for Brain-Computer Interfaces (BCI): predicting the relevance of the words or tokens a person reads to a target inference word. We use state-of-the-art Large Language Models (LLMs) to guide a new reading-embedding representation during training. This representation, which integrates EEG and eye-tracking biomarkers through an attention-based transformer encoder, achieved a mean 5-fold cross-validation accuracy of 68.7% across nine subjects on a balanced sample, with the highest single-subject accuracy reaching 71.2%. This study pioneers the integration of LLMs, EEG, and eye-tracking for predicting human reading comprehension at the word level. We also fine-tune the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model for word embedding, without any information about the reading tasks. Despite this absence of task-specific detail, the model attains an accuracy of 92.7%, validating our findings from LLMs. This work is a preliminary step toward developing tools that assist reading. |
---|---|
DOI: | 10.48550/arxiv.2401.15681 |
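The abstract describes an attention-based transformer encoder that fuses per-word EEG and eye-tracking biomarkers to predict whether each word is relevant to a target inference word. The following is a minimal illustrative sketch of that kind of architecture, not the authors' implementation: all feature dimensions, layer sizes, names, and the additive fusion scheme are assumptions chosen for the example.

```python
# Hypothetical sketch (not from the paper): fusing per-word EEG and
# eye-tracking features with an attention-based transformer encoder to
# predict per-word relevance to a target inference word.
import torch
import torch.nn as nn


class ReadingEmbeddingClassifier(nn.Module):
    def __init__(self, eeg_dim=840, gaze_dim=12, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        # Project each modality into a shared model dimension before fusion.
        self.eeg_proj = nn.Linear(eeg_dim, d_model)
        self.gaze_proj = nn.Linear(gaze_dim, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Binary head: is this word relevant to the target inference word?
        self.head = nn.Linear(d_model, 2)

    def forward(self, eeg, gaze):
        # eeg:  (batch, seq_len, eeg_dim)  per-word EEG features
        # gaze: (batch, seq_len, gaze_dim) per-word eye-tracking features
        x = self.eeg_proj(eeg) + self.gaze_proj(gaze)  # simple additive fusion
        x = self.encoder(x)                            # self-attention over the word sequence
        return self.head(x)                            # per-word relevance logits


if __name__ == "__main__":
    model = ReadingEmbeddingClassifier()
    eeg = torch.randn(2, 20, 840)   # 2 sentences, 20 words, 840 EEG features per word (assumed)
    gaze = torch.randn(2, 20, 12)   # 12 eye-tracking features per word (assumed)
    print(model(eeg, gaze).shape)   # torch.Size([2, 20, 2])
```

In practice, the per-word logits would be trained against LLM-derived relevance labels with a standard cross-entropy loss and evaluated with 5-fold cross-validation, as the abstract reports.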