Lightweight Single Image Super-Resolution With High-Continuity Attention

Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 32, pp. 2614-2618
Main Authors: Zhang, Ju; Zhong, Baojiang; Ma, Kai-Kuang
Format: Journal Article
Language: English
Published: IEEE, 2025

Summary: Window attention has become a popular choice in single image super-resolution (SISR) network design due to its efficient computation. However, its self-attention is restricted to fixed-size windows, leading to a lack of cross-window interaction. To address this, the benchmark SwinIR model adopts a shifted window strategy to capture long-range dependencies. However, we observe that its attention still suffers from discontinuities at window boundaries, resulting in inferior SISR performance. To overcome this issue, we propose a new scale-dual attention (SDA) module, consisting of three parallel branches that integrate window attention and pooling attention at three complementary scales. This enables hierarchical local-global interactions, yielding high-continuity attention maps. To validate the effectiveness of the proposed SDA, we develop a lightweight scale-dual attention network (SDAN) with approximately 878K parameters for SISR. Extensive experiments demonstrate that our SDAN achieves superior performance, outperforming state-of-the-art methods in both accuracy and efficiency.
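The fixed-window partitioning and the shifted-window remedy that the abstract contrasts can be illustrated in a few lines. The sketch below is an assumption-laden toy, not the authors' code: an 8x8 single-channel feature map, window size 4, and plain NumPy stand in for the actual transformer layers. It shows that two pixels straddling a window border attend to different windows under plain partitioning, but share a window after a SwinIR-style cyclic shift.

```python
import numpy as np

def window_partition(x, ws):
    """Split an (H, W, C) map into non-overlapping ws x ws windows.
    Self-attention is then computed independently inside each window."""
    H, W, C = x.shape
    x = x.reshape(H // ws, ws, W // ws, ws, C)
    # -> (num_windows, ws, ws, C), windows ordered row-major
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, ws, ws, C)

def shifted_partition(x, ws):
    """SwinIR-style shift: cyclically roll the map by ws // 2 before
    partitioning, so pixels near the old window borders land in a
    common new window and can interact."""
    shifted = np.roll(x, shift=(-(ws // 2), -(ws // 2)), axis=(0, 1))
    return window_partition(shifted, ws)

# Toy 8x8 map whose pixel values encode their positions (0..63).
feat = np.arange(64, dtype=np.float32).reshape(8, 8, 1)
plain = window_partition(feat, 4)      # 4 windows of 4x4
shift = shifted_partition(feat, 4)

# Pixels 3 and 4 flank a window border in row 0.
p3 = int(np.argwhere(plain[..., 0] == 3)[0][0])  # window index of pixel 3
p4 = int(np.argwhere(plain[..., 0] == 4)[0][0])  # window index of pixel 4
s3 = int(np.argwhere(shift[..., 0] == 3)[0][0])
s4 = int(np.argwhere(shift[..., 0] == 4)[0][0])
print(p3 == p4)  # False: separated by the fixed-window border
print(s3 == s4)  # True: the shift puts them in one window
```

The residual boundary discontinuity the authors point out arises because the shift only relocates the borders rather than removing them; the proposed SDA module instead mixes window attention with pooling attention across three scales so that attention maps stay continuous across the whole frame.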
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2025.3584016