Image aesthetic assessment with weighted multi-region aggregation based on information theory

Bibliographic Details
Published in: Pattern Analysis and Applications (PAA), Vol. 28, No. 2
Main Authors: Wang, Yin; Guo, Jing; Ke, Yongzhen; Wang, Kai; Yang, Shuai; Chen, Liming
Format: Journal Article
Language: English
Published: London: Springer London / Springer Nature B.V., 01.06.2025

Summary: Image aesthetic assessment is an active research topic, and identifying the regions that drive an image's aesthetic rating is important. To this end, we propose the weighted multi-region aggregation network WMRA-Net, which consists of three modules. The information theory-based image segmentation module partitions the image into regions using an information-theoretic criterion. The feature extraction module connects a Convolutional Neural Network (CNN) and a Graph Neural Network (GNN) in tandem: the CNN captures the image's shallow detail information and the GNN captures its deep semantic information, so that feature information is retained at every level; the shallow and deep features are then fused to predict aesthetic assessment scores. The weighted multi-region aggregation module adaptively assigns a weight to each region, adjusting the prediction results and locating high-quality aesthetic regions. The network can therefore analyze image aesthetics across multiple regions and offer constructive region-level aesthetic suggestions. Experimental results show that WMRA-Net achieves good results on several aesthetic assessment metrics.
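
The abstract names an information theory-based segmentation step without implementation detail. As a rough, hypothetical illustration of such a criterion (not the authors' method), the Python sketch below scores fixed grid tiles by the Shannon entropy of their intensity histograms, so the most information-rich regions can be selected first; the grid size and bin count are illustrative assumptions.

```python
import numpy as np

def tile_entropy(gray: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (bits) of a tile's intensity histogram."""
    hist, _ = np.histogram(gray, bins=bins, range=(0, 256))
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]  # drop empty bins so log2 is defined
    return float(-(p * np.log2(p)).sum())

def rank_tiles_by_entropy(gray: np.ndarray, grid: int = 4):
    """Cut the image into a grid x grid set of tiles and rank them by
    entropy, so the most information-rich regions come first."""
    h, w = gray.shape
    scored = []
    for i in range(grid):
        for j in range(grid):
            tile = gray[i * h // grid:(i + 1) * h // grid,
                        j * w // grid:(j + 1) * w // grid]
            scored.append(((i, j), tile_entropy(tile)))
    return sorted(scored, key=lambda t: t[1], reverse=True)

# Example: rank the regions of a random 256x256 grayscale "image".
regions = rank_tiles_by_entropy(np.random.randint(0, 256, (256, 256)))
```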
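Likewise, the weighted multi-region aggregation module is described only at a high level. The sketch below is one plausible reading (an assumption, not the paper's architecture): each region's fused feature vector yields an aesthetic score and an importance logit, the logits are softmax-normalized into adaptive weights, and the weighted sum gives the image-level prediction while the weights point to high-quality regions. The class name, head layout, and feature_dim are all hypothetical.

```python
import torch
import torch.nn as nn

class WeightedRegionAggregator(nn.Module):
    """Hypothetical aggregation head: scores each region, turns per-region
    logits into softmax weights, and returns the weighted sum as the
    image-level aesthetic score."""
    def __init__(self, feature_dim: int = 256):
        super().__init__()
        self.score_head = nn.Linear(feature_dim, 1)   # per-region aesthetic score
        self.weight_head = nn.Linear(feature_dim, 1)  # per-region importance logit

    def forward(self, region_feats: torch.Tensor):
        # region_feats: (batch, num_regions, feature_dim), one fused
        # shallow+deep feature vector per segmented region.
        scores = self.score_head(region_feats).squeeze(-1)        # (B, R)
        logits = self.weight_head(region_feats).squeeze(-1)       # (B, R)
        weights = torch.softmax(logits, dim=-1)                   # adaptive weights
        global_score = (weights * scores).sum(dim=-1)             # (B,)
        return global_score, weights  # large weights flag high-quality regions

# Example: 4 images, 8 regions each, 256-dim fused features.
score, w = WeightedRegionAggregator()(torch.randn(4, 8, 256))
```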
ISSN: 1433-7541
EISSN: 1433-755X
DOI: 10.1007/s10044-025-01490-1