Jensen-Shannon boosting learning for object recognition
Published in: 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), Vol. 2, pp. 144-149
Main Authors:
Format: Conference Proceeding
Language: English
Published: IEEE, 2005
Summary: In this paper, we propose a novel learning method, called Jensen-Shannon Boosting (JSBoost), and demonstrate its application to object recognition. JSBoost incorporates the Jensen-Shannon (JS) divergence [Y. Rubner et al. (2001)] into AdaBoost learning. The JS divergence is advantageous in that it provides a more appropriate measure of dissimilarity between two classes, and it is numerically more stable than other measures such as the Kullback-Leibler (KL) divergence (see [Y. Rubner et al. (2001)]). The best features are learned iteratively by maximizing the projected JS divergence, and the best weak classifiers are derived from them; the weak classifiers are then combined into a strong one by minimizing the recognition error. JSBoost learning is demonstrated on face recognition using a local binary pattern (LBP) [M. Pietikainen et al. (2004)] based representation: JSBoost selects the best LBP features from thousands of candidates and constructs a strong classifier from them. Experiments show that JSBoost produces better face recognition results than other AdaBoost variants such as RealBoost [R.E. Schapire et al. (1998)], GentleBoost [J. Friedman et al. (2000)], and KL-Boost [C. Liu et al. (2003)].
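
For reference (this is not stated in the record itself), the Jensen-Shannon divergence the abstract builds on is conventionally defined from the Kullback-Leibler divergence as the average divergence of each distribution to their equal-weight mixture; the symbols P, Q, and M below are ours:

```latex
% Standard definition of the Jensen-Shannon divergence between
% two distributions P and Q, with M their equal-weight mixture.
\[
\mathrm{JS}(P \,\|\, Q)
  = \tfrac{1}{2}\,\mathrm{KL}(P \,\|\, M)
  + \tfrac{1}{2}\,\mathrm{KL}(Q \,\|\, M),
\qquad
M = \tfrac{1}{2}\,(P + Q),
\]
where
\[
\mathrm{KL}(P \,\|\, M) = \sum_{x} P(x)\,\log\frac{P(x)}{M(x)}.
\]
```

Unlike the KL divergence, the JS divergence is symmetric and bounded by log 2, and it remains finite even when the two distributions have disjoint supports, which is the numerical-stability advantage the abstract alludes to.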
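The abstract describes the learning loop only at a high level. The Python sketch below is our own illustrative reconstruction of that idea under stated assumptions, not the authors' published algorithm: every name (js_divergence, select_best_feature, jsboost_train), the histogram binning, and the RealBoost-style weak-classifier combination are assumptions we supply for concreteness.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """JS divergence between two (possibly unnormalized) histograms."""
    p = p / (p.sum() + eps)
    q = q / (q.sum() + eps)
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def select_best_feature(X, y, w, n_bins=32):
    """Choose the feature whose weighted class-conditional response
    histograms are most separated under JS divergence.
    X: (n, d) feature responses; y: labels in {+1, -1}; w: weights."""
    best_j, best_d = -1, -np.inf
    for j in range(X.shape[1]):
        f = X[:, j]
        edges = np.linspace(f.min(), f.max(), n_bins + 1)
        hp, _ = np.histogram(f[y == +1], bins=edges, weights=w[y == +1])
        hn, _ = np.histogram(f[y == -1], bins=edges, weights=w[y == -1])
        d = js_divergence(hp, hn)
        if d > best_d:
            best_j, best_d = j, d
    return best_j, best_d

def jsboost_train(X, y, n_rounds=10, n_bins=32, eps=1e-12):
    """Toy boosting loop: JS-selected features, RealBoost-style
    per-bin confidences, AdaBoost-style sample reweighting."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []  # (feature index, bin edges, per-bin scores)
    for _ in range(n_rounds):
        j, _ = select_best_feature(X, y, w, n_bins)
        f = X[:, j]
        edges = np.linspace(f.min(), f.max(), n_bins + 1)
        hp, _ = np.histogram(f[y == +1], bins=edges, weights=w[y == +1])
        hn, _ = np.histogram(f[y == -1], bins=edges, weights=w[y == -1])
        # Weak classifier: half log-likelihood ratio in each bin.
        score = 0.5 * np.log((hp / (hp.sum() + eps) + eps)
                             / (hn / (hn.sum() + eps) + eps))
        bins = np.clip(np.digitize(f, edges) - 1, 0, n_bins - 1)
        w *= np.exp(-y * score[bins])  # down-weight correct predictions
        w /= w.sum()
        ensemble.append((j, edges, score))
    return ensemble
```

In this sketch the strong classifier would be the sign of the summed per-bin scores across rounds; in the paper's setting the columns of X would be LBP feature responses, but any scalar feature projections work for illustration.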
ISBN: 0769523722; 9780769523729
ISSN: 1063-6919
DOI: 10.1109/CVPR.2005.197