Exploiting Universum data in AdaBoost using gradient descent
Published in: Image and Vision Computing, Vol. 32, No. 8, pp. 550–557
Main Authors:
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.08.2014
Summary: Recently, Universum data, which does not belong to any class of the training data, has been used to train better classifiers. In this paper, we propose a novel boosting algorithm called UAdaBoost that improves the classification performance of AdaBoost by using Universum data. UAdaBoost chooses a function by minimizing a loss over both labeled data and Universum data. The cost function is minimized by a greedy, stagewise, functional gradient procedure, so each training stage of UAdaBoost is fast and efficient. Whereas standard AdaBoost weights only the labeled samples during training iterations, UAdaBoost also provides an explicit weighting scheme for Universum samples. In addition, the paper describes practical conditions under which Universum learning is effective; these conditions are based on an analysis of the distribution of ensemble predictions over the training samples. Experiments on handwritten digit classification and gender classification are presented. As the experimental results show, the proposed method can outperform standard AdaBoost when proper Universum data are selected.
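The record does not reproduce UAdaBoost's actual cost function or its weighting scheme, so the sketch below is only a minimal illustration of the kind of stagewise, functional-gradient boosting the abstract describes. The exponential loss on labeled samples is standard AdaBoost; the cosh penalty that pulls Universum predictions toward zero, the `lam` trade-off parameter, and the `universum_boost` helper are assumptions introduced here for illustration, not the paper's formulation.

```python
# Illustrative sketch only: the exact UAdaBoost loss and Universum weighting
# scheme come from the paper and are not given in this record. We assume an
# exponential loss on labeled samples plus a (hypothetical) cosh penalty that
# drives Universum predictions toward zero, minimized stagewise with stumps.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def universum_boost(X, y, X_univ, n_stages=50, lam=0.5):
    """Stagewise boosting with labeled data (y in {-1, +1}) and Universum data.

    lam weights the assumed Universum penalty against the labeled loss.
    Returns a list of (weak_learner, step_size) pairs.
    """
    F_lab = np.zeros(len(X))        # ensemble scores on labeled samples
    F_uni = np.zeros(len(X_univ))   # ensemble scores on Universum samples
    ensemble = []
    X_all = np.vstack([X, X_univ])

    for _ in range(n_stages):
        # Negative functional gradient of the labeled exponential loss:
        # target direction y_i with weight exp(-y_i * F(x_i)).
        w_lab = np.exp(-y * F_lab)
        d_lab = y.astype(float)

        # Assumed cosh(F) penalty on Universum samples: negative gradient is
        # -sinh(F), i.e. pull their predictions back toward zero.
        g_uni = -np.sinh(F_uni)
        w_uni = lam * np.abs(g_uni) + 1e-12
        d_uni = np.sign(g_uni)
        d_uni[d_uni == 0] = 1.0     # arbitrary direction when gradient is zero

        # Fit a weak learner (decision stump) to the combined gradient direction.
        w_all = np.concatenate([w_lab, w_uni])
        d_all = np.concatenate([d_lab, d_uni])
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X_all, d_all, sample_weight=w_all)
        h_lab = stump.predict(X)
        h_uni = stump.predict(X_univ)

        # Coarse line search for the step size over the combined cost.
        def cost(a):
            return (np.exp(-y * (F_lab + a * h_lab)).sum()
                    + lam * np.cosh(F_uni + a * h_uni).sum())
        alphas = np.linspace(0.01, 1.0, 50)
        alpha = alphas[np.argmin([cost(a) for a in alphas])]

        F_lab += alpha * h_lab
        F_uni += alpha * h_uni
        ensemble.append((stump, alpha))
    return ensemble


def predict(ensemble, X):
    """Sign of the additive ensemble score."""
    score = sum(a * h.predict(X) for h, a in ensemble)
    return np.sign(score)
```

The grid-based line search simply stands in for the stagewise minimization mentioned in the abstract, and any weak learner that accepts sample weights could replace the decision stump.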
Highlights:
• We propose a novel boosting algorithm that takes advantage of Universum data.
• A greedy, stagewise, functional gradient procedure is used to derive the method.
• Explicit weighting schemes for labeled and Universum samples are provided.
• Practical conditions for verifying the effectiveness of Universum learning are described.
• The algorithm achieves superior performance over standard AdaBoost by exploiting Universum data.
ISSN: 0262-8856, 1872-8138
DOI: 10.1016/j.imavis.2014.04.009