Morphological Segmentation of Urban Structures

Bibliographic Details
Published in: 2007 Urban Remote Sensing Joint Event, pp. 1 - 6
Main Authors: Akcay, H.G.; Aksoy, S.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.04.2007

Summary: Automatic segmentation of high-resolution remote sensing imagery is an important problem in urban applications because the resulting segmentations can provide valuable spatial and structural information that is complementary to pixel-based spectral information in classification. We present a method that combines structural information extracted by morphological processing with spectral information summarized using principal components analysis to produce precise segmentations that are also robust to noise. First, principal components are computed from hyperspectral data to obtain representative bands. Then, candidate regions are extracted by applying connected components analysis to the pixels selected according to their morphological profiles computed using opening and closing by reconstruction with increasing structuring element sizes. Next, these regions are represented using a tree, and the most meaningful ones are selected by optimizing a measure that consists of two factors: spectral homogeneity, which is calculated in terms of variances of spectral features, and neighborhood connectivity, which is calculated using sizes of connected components. The experiments show that the method is able to detect structures in the image that are more precise and more meaningful than the structures detected by another approach that does not make strong use of neighborhood and spectral information.
ISBN: 9781424407118, 1424407117
ISSN: 2334-0932, 2642-9535
DOI: 10.1109/URS.2007.371765
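Illustrative sketch: the summary describes a pipeline of PCA for representative bands, morphological profiles built with opening and closing by reconstruction at increasing structuring element sizes, and connected-component extraction of candidate regions. The following is a minimal Python sketch of those first stages only, not the authors' implementation; the scikit-learn/scikit-image tooling, the structuring element radii, and the response threshold are illustrative assumptions.

import numpy as np
from sklearn.decomposition import PCA
from skimage.morphology import disk, erosion, dilation, reconstruction
from skimage.measure import label


def principal_component_bands(cube, n_components=3):
    """Project a (rows, cols, bands) hyperspectral cube onto its leading PCs."""
    rows, cols, bands = cube.shape
    flat = cube.reshape(-1, bands).astype(np.float64)
    pcs = PCA(n_components=n_components).fit_transform(flat)
    return pcs.reshape(rows, cols, n_components)


def morphological_profile(band, radii=(2, 4, 8, 16)):
    """Opening/closing-by-reconstruction responses at increasing SE radii."""
    responses = []
    for r in radii:
        se = disk(r)
        # Opening by reconstruction: erode, then reconstruct by dilation under the original.
        opened = reconstruction(erosion(band, se), band, method="dilation")
        # Closing by reconstruction: dilate, then reconstruct by erosion above the original.
        closed = reconstruction(dilation(band, se), band, method="erosion")
        # Record how much each pixel changed; large changes flag structures near that scale.
        responses.append(band - opened)
        responses.append(closed - band)
    return np.stack(responses, axis=-1)


def candidate_regions(band, response_threshold=0.1):
    """Label connected components among pixels with a strong profile response (assumed rule)."""
    profile = morphological_profile(band)
    mask = profile.max(axis=-1) > response_threshold * np.ptp(band)
    return label(mask, connectivity=2)


# Hypothetical usage on a (rows, cols, bands) array loaded from disk:
#   cube = np.load("scene.npy")
#   pcs = principal_component_bands(cube)
#   regions = candidate_regions(pcs[..., 0])

The later stages of the paper, building a tree over these candidate regions and selecting the most meaningful ones by optimizing the spectral-homogeneity/neighborhood-connectivity measure, are not reproduced here.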