Fast SAR Image Segmentation via Merging Cost With Relative Common Boundary Length Penalty

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 52, No. 10, pp. 6434-6448
Main Authors: Shui, Peng-Lang; Zhang, Ze-Jun
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.10.2014
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2013.2296561

Summary: In this paper, a region-merging-based method is proposed for fast segmentation of amplitude-format synthetic aperture radar (SAR) images. It combines an existing fast initial partition, obtained by applying the watershed transform to the thresholded ratio edge strength map, with fast region merging that uses a new merging cost with a relative common boundary length penalty (RCBLP) and a nearest neighbor graph (NNG) for fast search of the minimal-cost edge on a region adjacency graph (RAG). A new statistical similarity measure, which is scale-invariant and has an approximately constant false alarm rate with respect to region sizes, is proposed and combined with an RCBLP term to form the new merging cost. The region-merging process, starting from the initial partition, is implemented efficiently by means of the RAG and NNG. Several quantitative indexes from optical image segmentation assessment are borrowed to quantitatively assess segmentation quality. Experiments on synthetic and real SAR images show that the proposed method is fast and attains higher quality segmentation results than two recent state-of-the-art methods.
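The summary describes a greedy pipeline: start from a watershed over-segmentation, build a region adjacency graph with common boundary lengths, and repeatedly merge the pair of adjacent regions with the lowest cost, where a longer relative common boundary makes merging cheaper. A minimal illustrative sketch of that loop in Python follows. The similarity term (absolute difference of region mean amplitudes), the penalty exponent `gamma`, the perimeter proxy, and the linear-scan minimum search are all placeholders for exposition; the paper's actual statistical similarity measure, RCBLP formula, and NNG-accelerated edge search are not reproduced here.

```python
from collections import defaultdict
import numpy as np

def merge_regions(image, labels, target_regions, gamma=1.0):
    """Greedy region merging with an RCBLP-style cost (illustrative sketch).

    `image` is a 2-D amplitude array; `labels` is an initial
    over-segmentation (e.g. from a watershed transform).
    """
    # Per-region pixel sum and count, for mean amplitude.
    stats = defaultdict(lambda: [0.0, 0])
    for v, l in zip(image.ravel(), labels.ravel()):
        stats[l][0] += v
        stats[l][1] += 1

    # Common boundary length between 4-adjacent regions (RAG edges).
    boundary = defaultdict(int)
    for a, b in ((labels[:, :-1].ravel(), labels[:, 1:].ravel()),
                 (labels[:-1, :].ravel(), labels[1:, :].ravel())):
        for i, j in zip(a.tolist(), b.tolist()):
            if i != j:
                boundary[frozenset((i, j))] += 1

    def cost(pair):
        i, j = tuple(pair)
        mi = stats[i][0] / stats[i][1]
        mj = stats[j][0] / stats[j][1]
        dissim = abs(mi - mj)                  # placeholder similarity term
        # Perimeter proxy: total inter-region boundary of each region.
        peri = sum(L for p, L in boundary.items() if i in p)
        perj = sum(L for p, L in boundary.items() if j in p)
        rel = boundary[pair] / min(peri, perj)  # relative common boundary
        return dissim / rel ** gamma            # longer boundary -> cheaper

    while len(stats) > target_regions and boundary:
        best = min(boundary, key=cost)          # linear scan, not the NNG
        i, j = sorted(best)                     # merge region j into i
        stats[i][0] += stats[j][0]
        stats[i][1] += stats[j][1]
        del stats[j]
        # Reroute j's RAG edges to i and accumulate boundary lengths.
        for pair in [p for p in boundary if j in p]:
            L = boundary.pop(pair)
            k = next(iter(pair - {j}))
            if k != i:
                boundary[frozenset((i, k))] += L
        labels = np.where(labels == j, i, labels)
    return labels
```

The linear scan makes each merge O(E^2) in the number of RAG edges, which is exactly the bottleneck the paper's NNG-based minimal-edge search is designed to remove.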