Image region annotation based on segmentation and semantic correlation analysis


Bibliographic Details
Published in: IET Image Processing, Vol. 12, No. 8, pp. 1331-1337
Main Authors: Zhang, Jing; Mu, Yakun; Feng, Shengwei; Li, Kehuang; Yuan, Yubo; Lee, Chin-Hui
Format: Journal Article
Language: English
Published: The Institution of Engineering and Technology, 01.08.2018

Summary: The authors propose an image region annotation framework that explores syntactic and semantic correlations among segmented regions in an image. A texture-enhanced JSEG image segmentation algorithm is first used to improve pixel consistency within segmented image regions. Next, each region is represented by a set of image codewords, also known as visual alphabets, each of which characterises certain low-level image features. A visual lexicon, whose vocabulary items are defined as either a single codeword or a co-occurrence of multiple alphabets, is formed and used to model middle-level semantic concepts. The concept classification models are trained with a maximal figure-of-merit algorithm on a collection of training images exhibiting multiple correlations, including spatial, syntactic and semantic relationships, between regions and their corresponding concepts. In addition, a region-semantic correlation model constructed with latent semantic analysis is used to correct potentially wrong annotations by analysing the relationship between image region positions and labels. When evaluated on the Corel 5K dataset, the proposed framework achieves accurate results on both image region concept tagging and whole-image annotation.
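To make the correction step concrete, the sketch below illustrates the general idea behind a latent-semantic-analysis model over region positions and labels: a position-by-label co-occurrence matrix is factored with a truncated SVD, and the low-rank reconstruction is used to re-rank candidate labels for a region. All data, names, and the rank choice are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical toy data: rows are coarse image-region positions, columns are
# concept labels; entries count how often a label co-occurred with a position
# in an assumed training set. Purely illustrative numbers.
labels = ["sky", "water", "grass", "sand"]
positions = ["top", "middle", "bottom"]
C = np.array([
    [40.0,  2.0,  1.0,  0.0],   # top regions are mostly "sky"
    [ 5.0, 30.0,  8.0,  2.0],   # middle regions are often "water"
    [ 1.0,  6.0, 25.0, 20.0],   # bottom regions: "grass"/"sand"
])

# Latent semantic analysis: keep only the k strongest singular directions,
# giving a smoothed, rank-k estimate of position-label correlation.
k = 2
U, s, Vt = np.linalg.svd(C, full_matrices=False)
C_lsa = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

def correct_annotation(position: str, candidate_labels: list[str]) -> str:
    """Re-rank a region's candidate labels by its position's LSA scores."""
    row = C_lsa[positions.index(position)]
    scores = {lab: row[labels.index(lab)] for lab in candidate_labels}
    return max(scores, key=scores.get)

# A classifier that mislabels a top region as "sand" gets corrected,
# because "sky" dominates the smoothed top-position row.
print(correct_annotation("top", ["sand", "sky"]))
```

In this toy setting the low-rank reconstruction suppresses rare position-label pairings, which is the mechanism by which such a model can flag and correct implausible annotations (e.g. "sand" at the top of an image).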
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/iet-ipr.2017.0917