MMOD-COG: A Database for Multimodal Cognitive Load Classification

Bibliographic Details
Published in: 2019 11th International Symposium on Image and Signal Processing and Analysis (ISPA), pp. 15 - 20
Main Authors: Mijic, Igor; Sarlija, Marko; Petrinovic, Davor
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2019
ISSN: 1849-2266
DOI: 10.1109/ISPA.2019.8868678

More Information
Summary: This paper presents a dataset for multimodal classification of cognitive load, recorded on a sample of students. The cognitive load was induced by having participants perform basic arithmetic tasks, while the multimodal aspect of the dataset comes from capturing both speech and physiological responses to those tasks. The goal of the dataset was two-fold: first, to provide an alternative to existing cognitive-load-focused datasets, which are usually built around Stroop or working-memory tasks; and second, to implement the cognitive load tasks in a way that makes the responses suitable for both speech and physiological analysis, ultimately making the dataset multimodal. The paper also presents preliminary classification benchmarks, in which SVM classifiers were trained and evaluated on speech signals alone, physiological signals alone, and combinations of the two. The multimodal nature of the classifiers may improve results on this inherently challenging machine learning problem, because it provides more data about both the intra-participant and inter-participant differences in how cognitive load manifests itself in affective responses.
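
As a rough illustration of the benchmark setup described in the summary, the sketch below compares unimodal SVM classifiers against a simple early-fusion (feature concatenation) multimodal classifier. This is not the authors' actual pipeline: the feature dimensions, variable names, and synthetic data are assumptions for illustration only, and the paper's real features, fusion strategy, and evaluation protocol may differ.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-ins for precomputed per-response feature matrices (assumed shapes).
rng = np.random.default_rng(0)
n_samples = 200
X_speech = rng.normal(size=(n_samples, 40))   # e.g. prosodic/spectral speech features (assumed)
X_physio = rng.normal(size=(n_samples, 12))   # e.g. skin-conductance/heart-rate features (assumed)
y = rng.integers(0, 2, size=n_samples)        # binary cognitive-load label (assumed)

def evaluate(X, y, name):
    # RBF-kernel SVM with feature standardization, scored by 5-fold cross-validation.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")

evaluate(X_speech, y, "speech only")
evaluate(X_physio, y, "physiology only")
# Early fusion: concatenate the two modalities' feature vectors before classification.
evaluate(np.hstack([X_speech, X_physio]), y, "speech + physiology (early fusion)")

On real data, cross-validation would typically be split by participant rather than by sample, so that inter-participant differences in how cognitive load manifests are not leaked between training and test folds.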