Beyond Self-Attention: External Attention Using Two Linear Layers for Visual Tasks
Attention mechanisms, especially self-attention, have played an increasingly important role in deep feature representation for visual tasks. Self-attention updates the feature at each position by computing a weighted sum of features using pair-wise affinities across all positions to capture the long...
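The abstract describes replacing pairwise self-attention with attention against small learnable external memory units implemented as two linear layers. A minimal NumPy sketch of that idea follows; the memory shapes, the double-normalization step, and all names here are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def external_attention(x, Mk, Mv):
    """Sketch of external attention (assumed shapes):
    x  : (N, d) input features at N positions
    Mk : (S, d) external key memory (a linear layer's weights)
    Mv : (S, d) external value memory (a second linear layer)
    Affinities are computed against the small shared memory (S entries)
    rather than pairwise across all N positions.
    """
    attn = x @ Mk.T                    # (N, S) affinities with key memory
    attn = softmax(attn, axis=1)       # normalize over memory entries
    attn = attn / (attn.sum(axis=0, keepdims=True) + 1e-9)  # assumed double normalization
    return attn @ Mv                   # (N, d) updated features

rng = np.random.default_rng(0)
N, d, S = 6, 4, 8
x = rng.standard_normal((N, d))
Mk = rng.standard_normal((S, d))
Mv = rng.standard_normal((S, d))
out = external_attention(x, Mk, Mv)
print(out.shape)
```

Because the memories have a fixed small size S, the cost is linear in the number of positions N, versus quadratic for pairwise self-attention.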
Published in | IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 45, No. 5, pp. 1-13
---|---
Main Authors | , , ,
Format | Journal Article
Language | English
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.05.2023