Switching between selection and fusion in combining classifiers: an experiment
| Published in | IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), Vol. 32, No. 2, pp. 146–156 |
|---|---|
| Main Author | |
| Format | Journal Article |
| Language | English |
| Published | United States: IEEE, 01.04.2002 |
| Subjects | |
| Summary | This paper presents a combination of classifier selection and fusion, using statistical inference to switch between the two. Selection is applied in those regions of the feature space where one classifier strongly dominates the others from the pool (called clustering-and-selection, CS), and fusion is applied in the remaining regions. The decision templates (DT) method is adopted for the classifier fusion part. The proposed combination scheme (called CS+DT) is compared experimentally against its two components, and also against majority vote, naive Bayes, two joint-distribution methods (BKS and a variant due to Wernecke (1988)), the dynamic classifier selection (DCS) algorithm DCS_LA based on local accuracy (Woods et al. (1997)), and simple fusion methods such as maximum, minimum, average, and product. Based on the results with five data sets, using homogeneous ensembles [multilayer perceptrons (MLPs)] and ensembles of different classifiers, we offer a discussion on when to combine classifiers and how classifier selection (static or dynamic) can be misled by differences within the classifier team. |
|---|---|
| ISSN | 1083-4419, 1941-0492 |
| DOI | 10.1109/3477.990871 |
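To make the switching idea in the summary concrete, the sketch below gives one plausible reading of a CS+DT-style combiner in Python. It is a minimal illustration, not the paper's implementation: k-means regions stand in for the clustering step, a crude two-proportion check stands in for the statistical inference about dominance, and decision-templates fusion (nearest mean decision profile) is used wherever no single classifier dominates. The class name CSPlusDT, the validation-set interface, and the scikit-learn-style classifier pool are assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans


class CSPlusDT:
    """Hypothetical sketch of clustering-and-selection with decision-templates fusion.

    Assumes a pool of pre-trained scikit-learn-style classifiers exposing
    predict / predict_proba, and integer class labels 0..c-1.
    """

    def __init__(self, classifiers, n_clusters=5, z=1.96):
        self.classifiers = classifiers   # pre-trained base classifiers
        self.n_clusters = n_clusters     # number of k-means regions
        self.z = z                       # confidence multiplier for the dominance check

    def fit(self, X_val, y_val):
        X_val, y_val = np.asarray(X_val), np.asarray(y_val)
        n_classes = int(y_val.max()) + 1
        # Decision profiles on the validation set: shape (n_samples, L, c).
        profiles = np.stack([clf.predict_proba(X_val) for clf in self.classifiers], axis=1)
        # Decision template per class = mean decision profile over that class.
        self.templates_ = np.stack([profiles[y_val == j].mean(axis=0) for j in range(n_classes)])
        # Partition the feature space and decide, per region, selection vs. fusion.
        self.km_ = KMeans(n_clusters=self.n_clusters, n_init=10).fit(X_val)
        regions = self.km_.labels_
        self.selected_ = {}              # region -> classifier index, or None for DT fusion
        for r in range(self.n_clusters):
            idx = regions == r
            n = int(idx.sum())
            accs = np.array([(clf.predict(X_val[idx]) == y_val[idx]).mean()
                             for clf in self.classifiers])
            best, runner_up = np.sort(accs)[::-1][:2]
            # Simplified dominance check: select only if the best classifier beats
            # the runner-up by more than the sampling noise in this region.
            se = np.sqrt((best * (1 - best) + runner_up * (1 - runner_up)) / max(n, 1))
            self.selected_[r] = int(np.argmax(accs)) if best - runner_up > self.z * se else None
        return self

    def predict(self, X):
        X = np.asarray(X)
        regions = self.km_.predict(X)
        profiles = np.stack([clf.predict_proba(X) for clf in self.classifiers], axis=1)
        preds = np.empty(len(X), dtype=int)
        for i, r in enumerate(regions):
            sel = self.selected_[r]
            if sel is not None:          # selection: trust the locally dominant classifier
                preds[i] = self.classifiers[sel].predict(X[i:i + 1])[0]
            else:                        # fusion: nearest decision template (squared Euclidean)
                dists = ((self.templates_ - profiles[i]) ** 2).sum(axis=(1, 2))
                preds[i] = int(np.argmin(dists))
        return preds
```

A combiner built this way would be fit on a held-out validation set (e.g., `CSPlusDT([clf1, clf2, clf3]).fit(X_val, y_val)`) and then applied with `predict(X_test)`; the paper's actual procedure for estimating dominance and building the regions should be taken from the article itself.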