Automatic Musical Composition System Based on Emotion Recognition by Face Images


Bibliographic Details
Published in: Journal of Japan Society for Fuzzy Theory and Intelligent Informatics, Vol. 32, No. 6, pp. 975-986
Main Authors: MAEDA, Yoichiro; FUJITA, Hibiki; KAMEI, Katsuari; COOPER, Eric W.
Format: Journal Article
Language: English
Published: Iizuka: Japan Society for Fuzzy Theory and Intelligent Informatics, 15.12.2020
Japan Science and Technology Agency

Summary: The effect of music on human emotion has been studied for a long time. Research on the emotions evoked by music, such as the feelings and impressions experienced while listening, is an established research field. However, while many studies have examined the emotions that music evokes, few have addressed generating music from an emotion. Therefore, in this study, we focus on facial expressions as a representation of emotion and aim to create music that matches the emotion recognized from a facial image. For example, the system automatically generates bright and pleasant music from a laughing face image, or dark and sad music from a crying face image. Russell's circumplex model is used for emotion recognition, and Hevner's circular scale is used to generate music corresponding to the recognized emotion. With this system it becomes possible, for example, to create suitable background music (BGM) for a movie scene from only an actor's face image. In this study, the above system was constructed and its effectiveness was confirmed through a Kansei evaluation experiment.
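The summary describes a pipeline that maps an emotion recognized on Russell's circumplex model (a valence-arousal plane) to one of Hevner's eight mood clusters for music generation. The paper does not state the exact mapping, but the general idea can be sketched as dividing the circumplex plane into angular sectors, one per Hevner cluster. The sector-to-cluster alignment below is an illustrative assumption, not the mapping used in the paper:

```python
import math

# Hevner's eight adjective clusters, here ordered so that each
# occupies a 45-degree sector of the valence-arousal plane.
# This alignment is an assumption for illustration only.
HEVNER_CLUSTERS = [
    "happy/merry",            #   0-45 deg: positive valence, mild arousal
    "exhilarated/exciting",   #  45-90 deg: high arousal, positive valence
    "vigorous/majestic",      #  90-135 deg
    "dignified/solemn",       # 135-180 deg: negative valence, high arousal
    "sad/doleful",            # 180-225 deg: negative valence, low arousal
    "dreamy/tender",          # 225-270 deg
    "serene/tranquil",        # 270-315 deg
    "graceful/sparkling",     # 315-360 deg
]

def hevner_cluster(valence: float, arousal: float) -> str:
    """Map a point on Russell's circumplex (both axes in [-1, 1])
    to a Hevner mood cluster by its angle around the origin."""
    angle = math.degrees(math.atan2(arousal, valence)) % 360.0
    return HEVNER_CLUSTERS[int(angle // 45) % 8]

# A laughing face (positive valence, moderate arousal) lands in the
# bright clusters; a crying face (negative valence, low arousal) in
# the dark ones, matching the examples in the summary.
print(hevner_cluster(0.8, 0.3))    # happy/merry
print(hevner_cluster(-0.7, -0.5))  # sad/doleful
```

A real system would then select tempo, mode, and harmony from the chosen cluster; this sketch only shows the emotion-to-cluster step.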
ISSN: 1347-7986; 1881-7203
DOI: 10.3156/jsoft.32.6_975