Beyond Self-Attention: External Attention Using Two Linear Layers for Visual Tasks

Attention mechanisms, especially self-attention, have played an increasingly important role in deep feature representation for visual tasks. Self-attention updates the feature at each position by computing a weighted sum of features, using pair-wise affinities across all positions, to capture the long-range dependency within a single sample.
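
The title describes external attention as a replacement for self-attention built from just two linear layers. Below is a minimal sketch of that idea, assuming PyTorch: the two linear layers act as small learnable shared memories against which every position computes its affinities, rather than computing pair-wise affinities across all positions. The module and parameter names (d_model, the number of memory units s) and the double-normalization step are illustrative assumptions for this sketch, not the authors' reference code.

```python
import torch
import torch.nn as nn


class ExternalAttention(nn.Module):
    """Sketch of attention against two small external memories
    implemented as cascaded linear layers."""

    def __init__(self, d_model: int, s: int = 64):
        super().__init__()
        self.mk = nn.Linear(d_model, s, bias=False)  # memory-key layer
        self.mv = nn.Linear(s, d_model, bias=False)  # memory-value layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n, d_model), n = number of positions
        attn = self.mk(x)                  # (batch, n, s): affinity of each
                                           # position with each memory unit
        attn = torch.softmax(attn, dim=1)  # normalize over positions
        attn = attn / (attn.sum(dim=2, keepdim=True) + 1e-9)  # l1-normalize
                                           # over memory units
        return self.mv(attn)               # (batch, n, d_model)


# Usage sketch: a batch of 2 feature maps with 196 positions, 128 channels.
out = ExternalAttention(d_model=128)(torch.randn(2, 196, 128))
```

Because the memory size s is fixed, the cost grows linearly with the number of positions n rather than quadratically as in self-attention, which is presumably the point of the title's "beyond self-attention" framing.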

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 45, No. 5, pp. 1-13
Main Authors: Guo, Meng-Hao; Liu, Zheng-Ning; Mu, Tai-Jiang; Hu, Shi-Min
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2023