The Moodo dataset: Integrating user context with emotional and color perception of music for affective music information retrieval
Published in: Journal of New Music Research, Vol. 46, No. 3, pp. 246-260
Format: Journal Article
Language: English
Published: Abingdon: Routledge (Taylor & Francis Ltd), 03.07.2017
Summary: This paper presents a new multimodal dataset, Moodo, that can aid the development of affective music information retrieval systems. Moodo's main novelties are a multimodal approach that links emotional and color perception to music, and the inclusion of user context. Analysis of the dataset reveals notable differences in emotion-color associations and their valence-arousal ratings in non-music and music contexts. We also show differences in ratings of perceived and induced emotions, especially for those with a perceived negative connotation, as well as the influence of genre and user context on the perception of emotions. By applying an intermediate data fusion model, we demonstrate the importance of user profiles for predictive modeling in affective music information retrieval scenarios.
ISSN: 0929-8215, 1744-5027
DOI: 10.1080/09298215.2017.1333518