A non-parametric approach to extending generic binary classifiers for multi-classification


Bibliographic Details
Published in: Pattern Recognition, Vol. 58, pp. 149-158
Main Authors: Santhanam, Venkataraman; Morariu, Vlad I.; Harwood, David; Davis, Larry S.
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.10.2016

Summary: Ensemble methods, which combine generic binary classifier scores to generate a multi-classification output, are commonly used in state-of-the-art computer vision and pattern recognition systems that rely on multi-classification. In particular, we consider the one-vs-one decomposition of the multi-class problem, where binary classifier models are trained to discriminate every class pair. We describe a robust multi-classification pipeline, which at a high level involves projecting binary classifier scores into compact orthogonal subspaces, followed by a non-linear probabilistic multi-classification step using Kernel Density Estimation (KDE). We compare our approach against state-of-the-art ensemble methods (DCS, DRCW) on 16 multi-class datasets. We also compare against the most commonly used ensemble methods (VOTE, NEST) on 6 real-world computer vision datasets. Finally, we measure the statistical significance of our approach using non-parametric tests. Experimental results show that our approach gives a statistically significant improvement in multi-classification performance over the state of the art.

Highlights:
• Ensemble methods combine binary classifiers to yield a multi-classification output.
• One-vs-one ensemble: binary classifiers trained to discriminate each class pair.
• We propose a robust non-parametric probabilistic one-vs-one ensemble method: KDEMRP.
• KDEMRP improves classification performance over state-of-the-art (DCS, DRCW).
• KDEMRP improvements are statistically significant.
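The sketch below illustrates, in rough terms, the kind of pipeline the summary describes: a one-vs-one decomposition whose pairwise scores are projected into a compact subspace and then classified with per-class Kernel Density Estimation. It is not the authors' KDEMRP implementation; the dataset, the use of linear SVMs as base classifiers, the PCA projection standing in for the paper's orthogonal-subspace step, and the fixed KDE bandwidth are all illustrative assumptions.

```python
# Illustrative sketch only (NOT the KDEMRP method from the paper):
# one-vs-one binary classifiers -> pairwise score vectors -> compact
# projection (PCA as a stand-in) -> per-class KDE -> max-likelihood label.
import numpy as np
from itertools import combinations
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KernelDensity
from sklearn.svm import LinearSVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
classes = np.unique(y_tr)

# One-vs-one decomposition: one linear SVM per class pair.
pairs = list(combinations(classes, 2))
models = []
for a, b in pairs:
    mask = np.isin(y_tr, [a, b])
    models.append(LinearSVC(dual=False).fit(X_tr[mask], y_tr[mask]))

def pairwise_scores(X):
    # Stack every pairwise decision score into one vector per sample.
    return np.column_stack([m.decision_function(X) for m in models])

S_tr, S_te = pairwise_scores(X_tr), pairwise_scores(X_te)

# Compact orthogonal projection of the score space (PCA stands in here).
pca = PCA(n_components=min(2, S_tr.shape[1])).fit(S_tr)
Z_tr, Z_te = pca.transform(S_tr), pca.transform(S_te)

# One KDE per class over projected scores (ideally fit on held-out scores);
# predict the class with the highest log-density.
kdes = {c: KernelDensity(bandwidth=0.5).fit(Z_tr[y_tr == c]) for c in classes}
log_dens = np.column_stack([kdes[c].score_samples(Z_te) for c in classes])
y_pred = classes[log_dens.argmax(axis=1)]

print("accuracy: %.3f" % (y_pred == y_te).mean())
```

In practice the projection and KDE bandwidth would be chosen on validation data; this sketch only shows how pairwise binary scores can feed a probabilistic, density-based multi-class decision.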
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2016.04.008