Feature Combination and the kNN Framework in Object Classification

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 27, No. 6, pp. 1368-1378
Main Authors: Hou, Jian; Gao, Huijun; Xia, Qi; Qi, Naiming
Format: Journal Article
Language: English
Published: United States, IEEE, 01.06.2016

Summary: In object classification, feature combination is commonly used to exploit the strengths of multiple complementary features and usually produces better classification results than any single feature. While multiple kernel learning (MKL) is a popular approach to feature combination in object classification, it does not always perform well in practice. On one hand, the optimization process in MKL typically consumes a large amount of computation and memory. On the other hand, in some cases MKL is found to perform no better than baseline combination methods. This observation motivates us to investigate the underlying mechanism of feature combination through average combination and weighted average combination. We empirically find that in average combination it is better to use only a subset of the most powerful features rather than all of them, whereas in one type of weighted average combination the best classification accuracy comes from a nearly sparse combination. We integrate these observations into the k-nearest neighbors (kNN) framework, based on which we further discuss issues related to sparse solutions and MKL. Finally, using the kNN framework, we present a new weighted average combination method that is shown to outperform MKL in both accuracy and efficiency in experiments. We believe this work is helpful in exploring the mechanism underlying feature combination.
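
The summary contrasts two baseline schemes, average combination and weighted average combination, within a kNN classifier. The following minimal Python sketch illustrates the general idea under stated assumptions: per-feature distance matrices are normalized, then merged with either equal weights or a nearly sparse weight vector before kNN voting. The data, weights, and helper names here are illustrative placeholders, not the authors' implementation.

    import numpy as np

    def knn_predict(dist, train_labels, k=5):
        # Majority vote among the k nearest training samples,
        # given an (n_test, n_train) distance matrix.
        neighbors = np.argsort(dist, axis=1)[:, :k]
        votes = train_labels[neighbors]
        return np.array([np.bincount(v).argmax() for v in votes])

    rng = np.random.default_rng(0)
    n_train, n_test, n_features = 100, 20, 3
    train_labels = rng.integers(0, 2, size=n_train)

    # One distance matrix per feature channel (e.g., color, texture,
    # shape); random stand-ins here for real per-feature distances.
    dists = [rng.random((n_test, n_train)) for _ in range(n_features)]

    # Normalize each matrix so the channels are on comparable scales.
    norm = [d / d.mean() for d in dists]

    # Average combination: equal weights over all features.
    avg_dist = np.mean(norm, axis=0)

    # Weighted average combination: a nearly sparse weight vector that
    # concentrates on the most powerful features (weights assumed here).
    w = np.array([0.70, 0.25, 0.05])
    weighted_dist = sum(wi * di for wi, di in zip(w, norm))

    print(knn_predict(avg_dist, train_labels))
    print(knn_predict(weighted_dist, train_labels))

In practice, the weights in the weighted scheme would be learned from validation accuracy rather than fixed by hand; the sketch only shows how equal and nearly sparse weightings enter the same kNN pipeline.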
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2015.2461552