Residual Local Feature Network for Efficient Super-Resolution

Main Authors | |
---|---|
Format | Journal Article |
Language | English |
Published | 16.05.2022 |
DOI | 10.48550/arxiv.2205.07514 |

Summary: Deep learning based approaches have achieved great performance in single image super-resolution (SISR). However, recent advances in efficient super-resolution focus on reducing the number of parameters and FLOPs, and they aggregate more powerful features by improving feature utilization through complex layer connection strategies. These structures may not be necessary to achieve higher running speed, which makes them difficult to deploy on resource-constrained devices. In this work, we propose a novel Residual Local Feature Network (RLFN). The main idea is to use three convolutional layers for residual local feature learning to simplify feature aggregation, which achieves a good trade-off between model performance and inference time. Moreover, we revisit the popular contrastive loss and observe that the selection of intermediate features of its feature extractor has a great influence on the performance. Furthermore, we propose a novel multi-stage warm-start training strategy: in each stage, the pre-trained weights from previous stages are utilized to improve model performance. Combined with the improved contrastive loss and training strategy, the proposed RLFN outperforms all state-of-the-art efficient image SR models in terms of runtime while maintaining both PSNR and SSIM. In addition, we won first place in the runtime track of the NTIRE 2022 efficient super-resolution challenge. Code will be available at https://github.com/fyan111/RLFN.
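
The building block the abstract describes, three convolutional layers used for residual local feature learning wrapped in a residual connection, could look roughly like the PyTorch sketch below. The channel width, ReLU activations, and trailing 1x1 fusion convolution are illustrative assumptions; the full block in the paper may include additional components the abstract does not detail.

```python
import torch
import torch.nn as nn

class ResidualLocalFeatureBlock(nn.Module):
    """Three 3x3 convolutions for local feature learning, wrapped in a
    residual connection. Channel width, ReLU activations, and the 1x1
    fusion convolution are assumptions for illustration only."""

    def __init__(self, channels: int = 48):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        # 1x1 convolution to fuse features after the residual add (assumed).
        self.fuse = nn.Conv2d(channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual local feature learning: learn a local correction and
        # add it back onto the input features.
        out = self.body(x) + x
        return self.fuse(out)


if __name__ == "__main__":
    block = ResidualLocalFeatureBlock(channels=48)
    feats = torch.randn(1, 48, 32, 32)
    print(block(feats).shape)  # torch.Size([1, 48, 32, 32])
```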
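
On the contrastive loss, the abstract only states that the choice of intermediate features in the feature extractor strongly affects performance. A common formulation of such a loss for SR pulls the super-resolved output toward the ground truth and pushes it away from a degraded negative (for example, the bicubically upsampled input) in the feature space of a frozen extractor. The VGG-19 backbone, the layer indices, and the L1 distance below are assumptions for illustration, not the layers the paper ultimately selects.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class ContrastiveLoss(nn.Module):
    """Feature-space contrastive loss: d(SR, HR) / d(SR, negative),
    averaged over intermediate layers of a frozen extractor.
    The VGG-19 backbone and layer indices are illustrative assumptions."""

    def __init__(self, layer_ids=(3, 8, 17)):
        super().__init__()
        vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1).features.eval()
        for p in vgg.parameters():
            p.requires_grad = False
        self.vgg = vgg
        self.layer_ids = set(layer_ids)
        self.l1 = nn.L1Loss()

    def extract(self, x):
        feats = []
        for i, layer in enumerate(self.vgg):
            x = layer(x)
            if i in self.layer_ids:
                feats.append(x)
            if i >= max(self.layer_ids):
                break  # no need to run deeper layers
        return feats

    def forward(self, sr, hr, neg, eps=1e-7):
        sr_f, hr_f, neg_f = self.extract(sr), self.extract(hr), self.extract(neg)
        loss = 0.0
        for f_sr, f_hr, f_neg in zip(sr_f, hr_f, neg_f):
            # Pull SR toward HR (numerator), push it away from the negative (denominator).
            loss = loss + self.l1(f_sr, f_hr) / (self.l1(f_sr, f_neg) + eps)
        return loss / len(sr_f)
```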
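
The multi-stage warm-start strategy can be read as training the same model several times, where each stage starts from the previous stage's weights rather than from scratch, with the optimizer and learning-rate schedule reset. The optimizer, scheduler, stage count, and loss in this sketch are placeholders, not the paper's training recipe.

```python
import copy
import torch

def warm_start_training(model, train_loader, num_stages=3, epochs_per_stage=100,
                        lr=5e-4, device="cuda"):
    """Multi-stage warm-start: each stage re-creates the optimizer and LR
    schedule but starts from the weights learned in the previous stage.
    Stage count, epochs, learning rate, and loss are illustrative placeholders."""
    model = model.to(device)
    best_state = copy.deepcopy(model.state_dict())
    criterion = torch.nn.L1Loss()
    for stage in range(num_stages):
        # Warm start: load the weights produced by the previous stage.
        model.load_state_dict(best_state)
        optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs_per_stage)
        for epoch in range(epochs_per_stage):
            for lr_img, hr_img in train_loader:
                lr_img, hr_img = lr_img.to(device), hr_img.to(device)
                optimizer.zero_grad()
                loss = criterion(model(lr_img), hr_img)
                loss.backward()
                optimizer.step()
            scheduler.step()
        # Keep this stage's final weights as the starting point for the next stage.
        best_state = copy.deepcopy(model.state_dict())
    return model
```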