Context-Sensitive Single-Modality Image Emotion Analysis: A Unified Architecture from Dataset Construction to CNN Classification
Published in | Proceedings - International Conference on Image Processing, pp. 1932 - 1936
---|---
Format | Conference Proceeding
Language | English
Published | IEEE, 01.10.2018
Summary: Still image emotion recognition has received increasing attention in recent years due to the tremendous amount of social-media content on the Web, with applications including opinion mining, visual emotion analysis, and search and retrieval, to name a few. Published work on the subject offers methods to detect image sentiment, while other work focuses on extracting true social signals such as happiness and anger. However, "context-sensitive" emotion recognition has been largely overlooked in the literature so far, and the problem in the single-modal domain, i.e., using only still images, remains underexplored. In this work, we introduce UCF ER, the largest dataset of images collected from the wild labeled with both emotion and context. We train a context-sensitive classifier that categorizes images by emotion and context jointly, introducing the first single-modal context-sensitive emotion recognition CNN model, trained on our newly constructed dataset. Relying on our categorical approach to emotion recognition, we claim and show that including context as part of a unified training process boosts performance while reducing dependency on cross-modality approaches. Experimental results demonstrate a considerable boost in performance compared to the state of the art.
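The abstract describes a single CNN trained on both emotion and context labels in a unified process. The record gives no architectural details, so the following is only a minimal sketch of what such a joint objective could look like: a shared feature vector (standing in for a CNN backbone's output) feeding two linear classification heads whose cross-entropy losses are summed. All names, label counts, and the weighting factor `lam` are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, label):
    # Negative log-likelihood of the true class.
    return -np.log(probs[label] + 1e-12)

# Hypothetical sizes; the paper's actual category sets are not listed here.
N_FEAT, N_EMOTION, N_CONTEXT = 8, 6, 4

rng = np.random.default_rng(0)
feat = rng.normal(size=N_FEAT)              # shared image feature
W_emo = rng.normal(size=(N_FEAT, N_EMOTION))  # emotion head
W_ctx = rng.normal(size=(N_FEAT, N_CONTEXT))  # context head

emo_probs = softmax(feat @ W_emo)
ctx_probs = softmax(feat @ W_ctx)

# Unified objective: emotion loss plus weighted context loss
# (the additive form and lam = 0.5 are assumptions).
lam = 0.5
loss = cross_entropy(emo_probs, 2) + lam * cross_entropy(ctx_probs, 1)
print(float(loss))
```

In a real model both heads would backpropagate into the shared backbone, which is one way a context signal can regularize the emotion features during unified training.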
ISSN: 2381-8549
DOI: 10.1109/ICIP.2018.8451048