Attending to Emotional Narratives

Attention mechanisms in deep neural networks have achieved excellent performance on sequence-prediction tasks. Here, we show that these recently proposed attention-based mechanisms, in particular the Transformer with its parallelizable self-attention layers, and the Memory Fusion Network with attent...
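The full text is not reproduced in this record, but as a rough illustration of the scaled dot-product self-attention the abstract refers to (the operation behind the Transformer's parallelizable self-attention layers), the following minimal NumPy sketch computes single-head self-attention over a toy sequence. All names, shapes, and weights here are illustrative assumptions and are not taken from the paper.

    import numpy as np

    def scaled_dot_product_self_attention(x, w_q, w_k, w_v):
        """Single-head self-attention over a sequence x of shape (T, d_model)."""
        q = x @ w_q  # queries, (T, d_k)
        k = x @ w_k  # keys,    (T, d_k)
        v = x @ w_v  # values,  (T, d_v)
        scores = q @ k.T / np.sqrt(k.shape[-1])          # (T, T) pairwise scores
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
        return weights @ v  # each timestep is a weighted mix of all timesteps

    # Toy usage: 5 timesteps of 8-dimensional features, projected to 4 dimensions.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((5, 8))
    w_q, w_k, w_v = (rng.standard_normal((8, 4)) for _ in range(3))
    out = scaled_dot_product_self_attention(x, w_q, w_k, w_v)
    print(out.shape)  # (5, 4)

Because the attention weights for every timestep come from the same matrix products, the whole sequence is processed in parallel rather than step by step as in a recurrent model.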

Bibliographic Details
Published in: International Conference on Affective Computing and Intelligent Interaction and workshops, pp. 648-654
Main Authors: Wu, Zhengxuan; Zhang, Xiyu; Zhi-Xuan, Tan; Zaki, Jamil; Ong, Desmond C.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2019