Blind quality assessment of multi-exposure fused images considering the detail, structure and color characteristics

Bibliographic Details
Published in: PLOS ONE, Vol. 18, No. 4
Main Authors: Lijun Li, Caiming Zhong, Zhouyan He
Format: Journal Article
Language: English
Published: Public Library of Science (PLoS), 06.04.2023

Summary: In the process of multi-exposure image fusion (MEF), various distortions inevitably arise and degrade visual quality, so predicting the visual quality of MEF images is essential. In this work, a novel blind image quality assessment (IQA) method is proposed for MEF images that considers detail, structure, and color characteristics. Specifically, to better perceive detail and structure distortion, the MEF image is decomposed via joint bilateral filtering into two layers (i.e., an energy layer and a structure layer). This decomposition is symmetric in the sense that the two layers together almost completely describe the information in the MEF image. Since the energy layer contains rich intensity information and the structure layer captures image structures, energy-related and structure-related features are extracted from these two layers to perceive detail and structure distortion. In addition, color-related features are extracted to represent color degradation; these are combined with the energy-related and structure-related features for quality regression. Experimental results on a public MEF image database demonstrate that the proposed method outperforms state-of-the-art quality assessment methods.
ISSN:1932-6203
DOI:10.1371/journal.pone.0283096
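
The abstract's two-layer decomposition can be illustrated with a minimal sketch. Note the assumptions: the paper uses joint bilateral filtering (which takes a guidance image), while the sketch below uses a plain bilateral filter and an additive split (structure layer = image minus energy layer); the filter parameters and the synthetic test image are illustrative, not taken from the paper.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Edge-preserving smoothing of a 2-D grayscale image in [0, 1]."""
    h, w = img.shape
    out = np.zeros_like(img)
    pad = np.pad(img, radius, mode="reflect")
    # Spatial Gaussian weights over the (2r+1) x (2r+1) window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2.0 * sigma_s**2))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range weights: penalize intensity differences from the center pixel.
            range_w = np.exp(-((patch - img[i, j]) ** 2) / (2.0 * sigma_r**2))
            weights = spatial * range_w
            out[i, j] = (weights * patch).sum() / weights.sum()
    return out

# Synthetic stand-in for a fused image: a luminance ramp plus noise.
rng = np.random.default_rng(0)
img = np.clip(np.linspace(0, 1, 64)[None, :]
              + 0.05 * rng.standard_normal((64, 64)), 0, 1)

energy = bilateral_filter(img)   # smooth, intensity-dominant layer
structure = img - energy         # residual layer holding fine detail
```

By construction the split is lossless (`energy + structure` reconstructs the image), mirroring the abstract's claim that the two layers together describe the MEF image almost completely; quality features would then be computed separately on each layer.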