The Moodo dataset: Integrating user context with emotional and color perception of music for affective music information retrieval

Bibliographic Details
Published in: Journal of New Music Research, Vol. 46, No. 3, pp. 246-260
Main Authors: Pesek, Matevž; Strle, Gregor; Kavčič, Alenka; Marolt, Matija
Format: Journal Article
Language: English
Published: Abingdon: Routledge (Taylor & Francis Ltd), 03.07.2017

Summary: This paper presents a new multimodal dataset, Moodo, that can aid the development of affective music information retrieval systems. Moodo's main novelties are a multimodal approach that links emotional and color perception to music and the inclusion of user context. Analysis of the dataset reveals notable differences in emotion-color associations and their valence-arousal ratings in non-music and music contexts. We also show differences in ratings of perceived and induced emotions, especially for those with a perceived negative connotation, as well as the influence of genre and user context on the perception of emotions. By applying an intermediate data fusion model, we demonstrate the importance of user profiles for predictive modeling in affective music information retrieval scenarios.
ISSN: 0929-8215; 1744-5027
DOI: 10.1080/09298215.2017.1333518