Saliency-based feature fusion convolutional network for blind image quality assessment

Bibliographic Details
Published in: Signal, Image and Video Processing, Vol. 16, No. 2, pp. 419-427
Main Authors: Shen, Lili; Zhang, Chuhe; Hou, Chunping
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.03.2022
ISSN: 1863-1703, 1863-1711
DOI: 10.1007/s11760-021-01958-7

Summary: Quality assessment plays an important role in promoting the adoption of digital imaging technology and its associated products. Since human beings are the ultimate assessors of image quality, the human visual system model has received much attention. In this paper, we present a novel IQA approach based on an analysis of human visual characteristics. Given that salient regions have a greater impact on subjects' judgments of image quality, a saliency-based filtering model is first designed to collect salient patches, and a saliency weighting matrix is obtained to represent their priority. Second, to learn more effective feature representations, we design a sub-network with up-sampling layers that captures features at different levels. These features are synthesized by a feature fusion convolutional network with a two-stream structure, and the features from different levels are mapped to a local score. Finally, the local score of each salient patch is summarized by a saliency-weighting model to produce the final predicted score. Experimental results on a series of publicly available databases, e.g., LIVE, CSIQ, and TID2013, demonstrate that the proposed method outperforms other state-of-the-art methods.
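
The final pooling step described in the summary, in which each patch's local score is weighted by its saliency before being summarized into an image-level score, can be sketched in a few lines. The following Python fragment is a minimal illustration under assumed inputs, not the authors' implementation: local_scores stands in for the per-patch predictions produced by the fusion network, and saliency_weights for the saliency weighting matrix.

    import numpy as np

    def saliency_weighted_score(local_scores, saliency_weights):
        """Pool per-patch quality scores into one image-level score.

        local_scores     : per-patch quality predictions (1-D array-like)
        saliency_weights : saliency weight of each patch (same length)
        """
        s = np.asarray(local_scores, dtype=float)
        w = np.asarray(saliency_weights, dtype=float)
        # Weighted mean: normalizing by the weight sum keeps the pooled
        # score on the same scale as the individual local scores.
        return float(np.sum(w * s) / np.sum(w))

    # Toy example: the most salient patch dominates the final score.
    print(saliency_weighted_score([3.2, 4.1, 2.8], [0.7, 0.2, 0.1]))

Normalizing by the sum of weights makes the pooled value a convex combination of the local scores, so patches from highly salient regions contribute proportionally more without changing the score's range.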