Histogram Thresholding Using Fuzzy and Rough Measures of Association Error

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 18, No. 4, pp. 879-888
Main Authors: Sen, D.; Pal, S.K.
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 01.04.2009
Summary: This paper presents a novel histogram thresholding methodology using fuzzy and rough set theories. The strength of the proposed methodology lies in the fact that, unlike many existing techniques, it does not make any prior assumptions about the histogram. For bilevel thresholding, every element of the histogram is associated with one of the two regions by comparing the corresponding errors of association. The regions are considered ambiguous in nature, and, hence, the error measures are based on the fuzziness or roughness of the regions. Multilevel thresholding is carried out using the proposed bilevel thresholding method in a tree-structured algorithm. Segmentation, object/background separation, and edge extraction are performed using the proposed methodology. A quantitative index to evaluate image segmentation performance is also proposed using the median of absolute deviation from median measure, which is a robust estimator of scale. Extensive experimental results are given to demonstrate the effectiveness of the proposed methods in terms of both qualitative and quantitative measures.
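
The summary describes bilevel thresholding by comparing fuzziness- or roughness-based association errors, plus a segmentation index built on the median of absolute deviation from the median (MAD). The sketch below only illustrates that general flavor; the linear memberships anchored at region means, the ambiguity score, and the `mad_scale` helper are assumptions of this sketch, not the paper's actual error measures or evaluation index.

```python
import numpy as np

def fuzzy_bilevel_threshold(hist):
    """Pick a bilevel threshold by minimizing a fuzziness-based ambiguity score.

    Hypothetical sketch: for each candidate threshold t, gray levels get linear
    memberships between the dark- and bright-region means, and the score sums
    min(mu, 1 - mu) weighted by the histogram (a standard linear index of
    fuzziness, not the paper's association-error measures).
    """
    hist = np.asarray(hist, dtype=float)
    levels = np.arange(len(hist), dtype=float)
    total = hist.sum()
    best_t, best_score = None, np.inf
    for t in range(1, len(hist) - 1):
        lo, hi = hist[:t], hist[t:]
        if lo.sum() == 0 or hi.sum() == 0:
            continue
        m_lo = (levels[:t] * lo).sum() / lo.sum()   # dark-region mean
        m_hi = (levels[t:] * hi).sum() / hi.sum()   # bright-region mean
        # Linear membership of each gray level in the bright region.
        mu = np.clip((levels - m_lo) / (m_hi - m_lo), 0.0, 1.0)
        # Ambiguity of associating each gray level with either region.
        score = (hist * np.minimum(mu, 1.0 - mu)).sum() / total
        if score < best_score:
            best_t, best_score = t, score
    return best_t

def mad_scale(values):
    """Median of absolute deviations from the median, a robust scale estimate."""
    values = np.asarray(values, dtype=float)
    med = np.median(values)
    return np.median(np.abs(values - med))

# Example: a synthetic bimodal histogram over 256 gray levels.
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 15, 5000)])
hist, _ = np.histogram(np.clip(samples, 0, 255).astype(int), bins=256, range=(0, 256))
print("threshold:", fuzzy_bilevel_threshold(hist))
```

On such a bimodal histogram the selected threshold falls between the two modes; the paper's own method instead assigns each histogram element to a region by directly comparing fuzziness- or roughness-based association errors.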
ISSN: 1057-7149
EISSN: 1941-0042
DOI: 10.1109/TIP.2009.2012890