Fast, Accurate and Lightweight Super-Resolution with Neural Architecture Search

Bibliographic Details
Published in 2020 25th International Conference on Pattern Recognition (ICPR), pp. 59-64
Main Authors Chu, Xiangxiang; Zhang, Bo; Ma, Hailong; Xu, Ruijun; Li, Qingyuan
Format Conference Proceeding
Language English
Published IEEE 10.01.2021

Summary: Deep convolutional neural networks demonstrate impressive results in the super-resolution domain. A series of studies concentrate on improving peak signal-to-noise ratio (PSNR) by using much deeper networks, which are unfriendly to resource-constrained settings. Pursuing a trade-off between restoration capacity and model simplicity remains non-trivial. Recent contributions strive to strike this balance manually, whereas our work achieves the same goal automatically with neural architecture search. Specifically, we handle super-resolution with a multi-objective approach. We also propose an elastic search tactic at both the micro and macro levels, based on a hybrid controller that profits from evolutionary computation and reinforcement learning. Quantitative experiments lead to the conclusion that our generated models dominate most state-of-the-art methods at comparable FLOPS.
DOI:10.1109/ICPR48806.2021.9413080
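Note: the claim that the generated models "dominate" competing methods follows the standard multi-objective (Pareto) notion of dominance over PSNR and FLOPS. The sketch below is illustrative only, not the authors' implementation; the model names and scores are hypothetical placeholders. It shows how a pool of searched candidates could be reduced to a PSNR-vs-FLOPS Pareto front.

# Illustrative sketch: Pareto-dominance filtering of candidate super-resolution
# models under two objectives (maximize PSNR, minimize FLOPS).
from typing import List, NamedTuple


class Candidate(NamedTuple):
    name: str
    psnr_db: float   # restoration quality, higher is better
    gflops: float    # compute cost, lower is better


def dominates(a: Candidate, b: Candidate) -> bool:
    """a dominates b if it is no worse in both objectives and strictly better in at least one."""
    no_worse = a.psnr_db >= b.psnr_db and a.gflops <= b.gflops
    strictly_better = a.psnr_db > b.psnr_db or a.gflops < b.gflops
    return no_worse and strictly_better


def pareto_front(pool: List[Candidate]) -> List[Candidate]:
    """Keep only candidates that no other candidate dominates."""
    return [c for c in pool if not any(dominates(o, c) for o in pool if o is not c)]


if __name__ == "__main__":
    # Hypothetical search results.
    pool = [
        Candidate("model_a", psnr_db=37.8, gflops=235.0),
        Candidate("model_b", psnr_db=37.6, gflops=74.0),
        Candidate("model_c", psnr_db=37.5, gflops=120.0),  # dominated by model_b
    ]
    for c in pareto_front(pool):
        print(f"{c.name}: {c.psnr_db} dB @ {c.gflops} GFLOPS")

In this toy example, model_c is dropped because model_b achieves higher PSNR at lower FLOPS, while model_a and model_b both survive since neither is better in both objectives at once.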