Adjustable super-resolution network via deep supervised learning and progressive self-distillation

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 500, pp. 379-393
Main Authors: Li, Juncheng; Fang, Faming; Zeng, Tieyong; Zhang, Guixu; Wang, Xizhao
Format: Journal Article
Language: English
Published: Elsevier B.V., 21.08.2022
Summary: With the use of convolutional neural networks, Single-Image Super-Resolution (SISR) has advanced dramatically in recent years. However, we observe that the structure of all these models must be identical during training and testing. This severely limits their flexibility, making it difficult to deploy the same model on platforms of different sizes (e.g., computers, smartphones, and embedded devices). It is therefore crucial to develop a model that can adapt to different needs without retraining. To achieve this, we propose a lightweight Adjustable Super-Resolution Network (ASRN). Specifically, ASRN consists of a series of Multi-scale Aggregation Blocks (MABs), each a lightweight and efficient module specially designed for feature extraction. Meanwhile, a Deep Supervised Learning (DSL) strategy is introduced to guarantee the performance of each sub-network, and a novel Progressive Self-Distillation (PSD) strategy is proposed to further improve the model's intermediate results. With the help of the DSL and PSD strategies, ASRN achieves elastic image reconstruction. ASRN is the first elastic SISR model, showing good results when the model size is changed directly, without retraining.
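The core idea in the summary, that one set of trained weights can serve sub-networks of different depths at test time, can be illustrated with a toy NumPy sketch. This is not the authors' implementation: the block function, the shared reconstruction head, and the `depth` parameter are all hypothetical stand-ins for the paper's MABs and deep-supervised outputs.

```python
import numpy as np

rng = np.random.default_rng(0)
C = 8  # feature-channel width of the toy model

def mab(x, w):
    # stand-in for one feature-extraction block (a "MAB" in the paper);
    # a residual transform, so truncating the block stack stays well-behaved
    return x + np.tanh(x @ w)

def forward(x, blocks, head, depth):
    # run only the first `depth` blocks, then a shared reconstruction head;
    # during training, deep supervision would attach a loss at every depth
    for w in blocks[:depth]:
        x = mab(x, w)
    return x @ head

# one set of (randomly initialized) weights, reused at every depth
blocks = [rng.standard_normal((C, C)) * 0.1 for _ in range(4)]
head = rng.standard_normal((C, C)) * 0.1
x = rng.standard_normal((1, C))

# the same weights serve every depth at test time -- no retraining
outs = {d: forward(x, blocks, head, d) for d in (1, 2, 4)}
```

Because every depth produces a valid output from the same weights, the model size can be chosen per deployment target, which is the elastic behavior the summary describes.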
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2022.05.061