Guided Superpixel Method for Topographic Map Processing

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 54, No. 11, pp. 6265–6279
Main Authors: Miao, Qiguang; Liu, Tiange; Song, Jianfeng; Gong, Maoguo; Yang, Yun
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.11.2016
Summary: Superpixels have been widely used in many computer vision and image processing tasks, but rarely in topographic map processing, because of the complex distribution of geographic elements in such images. We propose a novel superpixel-generating method based on a guided watershed transform (GWT). Before the GWT, cues about the distribution of geographic elements and the boundaries between different elements must be obtained. A linear-feature extraction method based on a compound opposite Gaussian filter and a shear transform is presented to acquire the distribution information. Meanwhile, a boundary detection method based on the color-opponent mechanisms of the visual system is employed to obtain the boundary information. Both the linear features and the boundaries are then fed into the final partition procedure to produce superpixels. Experiments show that our method outperforms all comparison methods, both classic and state of the art, in shape control, size control, and boundary adherence. Furthermore, experiments verify the low computational complexity and low memory cost of our method, which make it feasible to process large-scale topographic maps.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2016.2567481
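
The summary outlines a three-stage pipeline: extract linear features, detect boundaries with a color-opponent detector, and partition the map with a marker-guided watershed. Below is a minimal Python sketch of that pipeline using scikit-image. It is not the authors' GWT implementation: the compound opposite Gaussian filter, shear transform, and color-opponent detector are approximated here by a simple difference of Gaussians and a Sobel gradient on the Lab a* (opponent) channel, and the function name guided_superpixels and the parameter n_seeds are illustrative assumptions.

```python
import numpy as np
from skimage import color, filters, segmentation

def guided_superpixels(rgb, n_seeds=400):
    """Sketch of a guided-watershed superpixel pipeline.

    Not the paper's GWT; both cue filters are simple stand-ins.
    """
    gray = color.rgb2gray(rgb)

    # Boundary cue: gradient magnitude of an opponent color channel,
    # a crude stand-in for the color-opponent boundary detector.
    lab = color.rgb2lab(rgb)
    boundary = filters.sobel(lab[..., 1])

    # Linear-feature cue: difference of Gaussians as a stand-in for
    # the compound opposite Gaussian filter plus shear transform.
    linear = np.abs(filters.gaussian(gray, 1.0) - filters.gaussian(gray, 3.0))

    # Guidance relief: watershed lines form along ridges of this
    # surface, so they follow boundaries and linear features.
    relief = (boundary / (boundary.max() + 1e-8)
              + linear / (linear.max() + 1e-8))

    # Regularly spaced seed markers give roughly uniform superpixel
    # size and shape; each seed gets a unique positive label.
    step = max(1, int(np.sqrt(gray.size / n_seeds)))
    markers = np.zeros(gray.shape, dtype=np.int32)
    ys, xs = np.mgrid[step // 2:gray.shape[0]:step,
                      step // 2:gray.shape[1]:step]
    markers[ys, xs] = np.arange(1, ys.size + 1).reshape(ys.shape)

    # Marker-controlled watershed floods the relief from the seeds;
    # the returned label image is the superpixel partition.
    return segmentation.watershed(relief, markers=markers)
```

In this sketch, increasing the weight of the linear cue in the relief discourages watershed lines from cutting across thin structures such as roads and contour lines, which is the role the summary assigns to the linear-feature channel in guiding the partition.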