Remote-sensing image retrieval by combining image visual and semantic features

Bibliographic Details
Published in: International Journal of Remote Sensing, Vol. 34, no. 12, pp. 4200–4223
Main Authors: Wang, M.; Wan, Q.M.; Gu, L.B.; Song, T.Y.
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis, 2013

More Information
Summary: This article presents a remote-sensing image retrieval scheme that combines image visual features with object and spatial-relationship semantic features. The scheme comprises two main stages: offline multi-feature extraction and online query. In the offline stage, remote-sensing images are decomposed into blocks using a quin-tree structure, and visual features, including textures and colours, are extracted and stored for each block. Object-oriented support vector machine (SVM) classification is then carried out to obtain the object semantics of each image, and spatial-relationship semantics are derived using a new spatial orientation description method. The online query stage is a coarse-to-fine process with two sub-steps: a rough image retrieval based on the object semantics, followed by a template-based fine retrieval that uses both visual and semantic features. This method differs from many other semantic-based remote-sensing image retrieval methods and is suitable for ‘scene matching’. Moreover, the scheme is distinctive in its system design, its spatial-relationship semantic description, and its way of combining and utilizing visual and semantic features.
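The quin-tree decomposition mentioned in the summary can be pictured as a quad-tree with an extra centred block at each level. The sketch below is an illustrative reading only (the paper's exact block definition is not given in this record); the function name and the choice of a centred fifth block of quadrant size are assumptions.

```python
def quintree_blocks(x, y, w, h, depth):
    """Recursively decompose an image region into quin-tree blocks:
    the region itself, then four quadrants plus one centred block of
    quadrant size, each decomposed further until `depth` reaches 0.
    Returns a list of (x, y, width, height) tuples."""
    blocks = [(x, y, w, h)]
    if depth == 0:
        return blocks
    hw, hh = w // 2, h // 2
    children = [
        (x, y, hw, hh),                      # top-left quadrant
        (x + hw, y, hw, hh),                 # top-right quadrant
        (x, y + hh, hw, hh),                 # bottom-left quadrant
        (x + hw, y + hh, hw, hh),            # bottom-right quadrant
        (x + hw // 2, y + hh // 2, hw, hh),  # centred block (assumed)
    ]
    for cx, cy, cw, ch in children:
        blocks.extend(quintree_blocks(cx, cy, cw, ch, depth - 1))
    return blocks


# One level of decomposition of a 512x512 image yields the full image
# plus five 256x256 blocks, each of which would carry its own texture
# and colour features in the offline stage.
blocks = quintree_blocks(0, 0, 512, 512, 1)
```

Visual features (textures and colours) would then be extracted per block, so the overlapping centre block helps capture objects that straddle quadrant boundaries.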
Bibliography: http://dx.doi.org/10.1080/01431161.2013.774098
ISSN: 0143-1161; 1366-5901
DOI: 10.1080/01431161.2013.774098