Attending to Emotional Narratives

Bibliographic Details
Published in: International Conference on Affective Computing and Intelligent Interaction and Workshops, pp. 648-654
Main Authors: Wu, Zhengxuan; Zhang, Xiyu; Zhi-Xuan, Tan; Zaki, Jamil; Ong, Desmond C.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2019
Summary: Attention mechanisms in deep neural networks have achieved excellent performance on sequence-prediction tasks. Here, we show that these recently proposed attention-based mechanisms (in particular, the Transformer with its parallelizable self-attention layers, and the Memory Fusion Network with attention across modalities and time) also generalize well to multimodal time-series emotion recognition. Using a recently introduced dataset of emotional autobiographical narratives, we adapt and apply these two attention mechanisms to predict emotional valence over time. Our models perform extremely well, in some cases reaching performance comparable with human raters. We end with a discussion of the implications of attention mechanisms for affective computing.
ISSN: 2156-8111
DOI: 10.1109/ACII.2019.8925497
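
As a rough illustration of the kind of model the summary describes, the PyTorch sketch below maps a multimodal feature sequence to a per-timestep valence prediction with a Transformer encoder's self-attention layers. It is not the authors' implementation: the class name (ValenceTransformer), the 300-dimensional fused features, and all hyperparameters are illustrative assumptions.

    # Minimal sketch, not the paper's code: self-attention over a narrative's
    # timesteps, regressing one valence value per timestep.
    import torch
    import torch.nn as nn

    class ValenceTransformer(nn.Module):
        def __init__(self, input_dim=300, d_model=64, nhead=4, num_layers=2):
            super().__init__()
            # Project concatenated multimodal features to the model dimension
            # (feature size is an assumption, not the dataset's actual size).
            self.input_proj = nn.Linear(input_dim, d_model)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=nhead, batch_first=True)
            # Parallelizable self-attention across all timesteps at once.
            self.encoder = nn.TransformerEncoder(encoder_layer,
                                                 num_layers=num_layers)
            # One scalar valence rating per timestep.
            self.head = nn.Linear(d_model, 1)

        def forward(self, x):
            # x: (batch, time, input_dim) multimodal features per timestep
            h = self.encoder(self.input_proj(x))
            return self.head(h).squeeze(-1)  # (batch, time) valence over time

    # Example: 8 narratives, 120 timesteps, 300-dim fused features.
    model = ValenceTransformer()
    valence = model(torch.randn(8, 120, 300))
    print(valence.shape)  # torch.Size([8, 120])

A cross-modal attention variant in the spirit of the Memory Fusion Network would instead keep per-modality sequences separate and attend across modalities and time before regressing valence; the single-encoder version above is only the simpler of the two setups the summary mentions.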