Face Image Super-resolution Reconstruction via Mapping Matrix and Multilayer Model



Bibliographic Details
Published in: IAENG International Journal of Computer Science, Vol. 47, no. 1, p. 68
Main Authors: Wang, Yan; Wang, Jianchun; Li, Fengju; Zhou, Yancong; Xiong, Min; Li, Ming
Format: Journal Article
Language: English
Published: Hong Kong: International Association of Engineers, 22.02.2020
Summary: Making full use of the information provided by the training set has always been the goal of face image super-resolution reconstruction when only small samples are available. To address this, we propose a novel algorithm based on a mapping matrix and a multilayer model. First, we double the training set with a flip method. Then, we extract one-step and two-step gradient features from the training set and divide the face images and their feature images into many multi-scale patches; patches at the same position form the reconstructed training set. High-resolution and low-resolution feature images are used simultaneously to build the weights of the mapping matrix. To reduce local ghosting and blurring in the reconstructed image, a matrix constraint is constructed from the distances between low-resolution and high-resolution feature image blocks. Finally, the multilayer model is built from image patches of different scales so that the reconstruction reflects the degradation process of the image. Experimental results on the small-sample databases FERET, FEI, and CAS-PEAL-R1 show that the proposed method achieves better face image reconstruction quality than state-of-the-art methods.
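
The first stages of the pipeline described in the summary (doubling the training set by flipping, extracting one-step and two-step gradient features, and dividing images into position-keyed patches) can be sketched roughly as below. This is only an illustrative sketch: the function names, the patch size and step, and the finite-difference gradient operators are assumptions, not the authors' exact implementation.

```python
import numpy as np

def augment_by_flip(faces):
    """Double a small training set by adding horizontally flipped copies
    of every face image (the 'flip method' mentioned in the summary)."""
    return faces + [np.fliplr(f) for f in faces]

def gradient_features(img):
    """One-step (first-order) and two-step (second-order) gradient features.
    Simple finite differences are assumed here; the paper's exact
    gradient operators may differ."""
    gx = np.gradient(img, axis=1)    # first-order, horizontal
    gy = np.gradient(img, axis=0)    # first-order, vertical
    gxx = np.gradient(gx, axis=1)    # second-order, horizontal
    gyy = np.gradient(gy, axis=0)    # second-order, vertical
    return np.stack([gx, gy, gxx, gyy], axis=-1)

def extract_patches(img, size=8, step=4):
    """Split an image (or feature map) into overlapping patches keyed by
    their top-left position, so that patches at the same position across
    all training images can be grouped into a position-specific set."""
    patches = {}
    h, w = img.shape[:2]
    for i in range(0, h - size + 1, step):
        for j in range(0, w - size + 1, step):
            patches[(i, j)] = img[i:i + size, j:j + size]
    return patches

# Hypothetical usage on a toy training set of two 32x32 "face" images:
faces = [np.random.rand(32, 32) for _ in range(2)]
faces = augment_by_flip(faces)                 # training set doubled to 4
feats = [gradient_features(f) for f in faces]  # gradient feature maps
patch_sets = [extract_patches(f) for f in faces]
```

In the full method, patches from the high- and low-resolution feature images at corresponding positions would then be used to weight the mapping matrix and to build the distance-based constraint, and the multilayer model would repeat this process at several patch scales.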
ISSN: 1819-656X, 1819-9224