Granger Causality Driven AHP for Feature Weighted kNN
| Published in | Pattern Recognition, Vol. 66, pp. 425-436 |
| --- | --- |
| Main Authors | , , |
| Format | Journal Article |
| Language | English |
| Published | Elsevier Ltd, 01.06.2017 |

Summary: The kNN algorithm remains a popular choice for pattern classification to date due to its non-parametric nature, its ease of implementation, and the fact that its classification error is bounded by twice the Bayes error. In this paper, we show that the performance of the kNN classifier improves significantly when two criteria, based on class-wise group statistics of the training data, are used during pairwise comparison of the features in a given dataset. Granger causality is employed to assign a preference to each criterion. The Analytic Hierarchy Process (AHP) is then applied to obtain weights for the individual features from the two criteria and their preferences. Finally, these weights are used to build a weighted distance function for kNN classification. Comprehensive experimentation on fifteen benchmark datasets from the UCI Machine Learning Repository clearly demonstrates the superiority of the proposed Granger-causality-driven, AHP-induced kNN algorithm over the kNN method with many different distance metrics and with various feature selection strategies. In addition, the proposed method is shown to perform well on high-dimensional face and handwriting recognition datasets.
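
The summary only outlines the pipeline (group-statistics criteria, Granger-causality preferences, AHP weights, weighted kNN), so the following is a minimal, non-authoritative sketch of the last two stages: deriving feature weights by AHP synthesis and plugging them into a feature-weighted kNN distance. The per-criterion pairwise-comparison matrices and the criterion preferences are assumed to be given (in the paper they come from class-wise group statistics and Granger causality, respectively); the helper names `ahp_priority`, `feature_weights`, and `weighted_knn_predict` and the numeric judgments are illustrative, not taken from the paper.

```python
import numpy as np

def ahp_priority(pairwise):
    """Priority (weight) vector of an AHP pairwise-comparison matrix:
    its principal eigenvector, normalised to sum to 1."""
    vals, vecs = np.linalg.eig(pairwise)
    v = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
    return v / v.sum()

def feature_weights(per_criterion_matrices, criterion_preferences):
    """AHP synthesis: combine per-criterion feature priorities
    with the criterion preferences."""
    priorities = np.column_stack([ahp_priority(M) for M in per_criterion_matrices])
    w = priorities @ np.asarray(criterion_preferences)
    return w / w.sum()

def weighted_knn_predict(X_train, y_train, X_test, w, k=5):
    """kNN with the feature-weighted Euclidean distance
    d(x, z) = sqrt(sum_j w_j * (x_j - z_j)^2)."""
    preds = []
    for x in X_test:
        d = np.sqrt((w * (X_train - x) ** 2).sum(axis=1))
        nearest = y_train[np.argsort(d)[:k]]
        labels, counts = np.unique(nearest, return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return np.array(preds)

# Toy example with 3 features and 2 criteria; the pairwise judgments are placeholders.
A = np.array([[1.0, 3.0, 0.5], [1 / 3.0, 1.0, 0.25], [2.0, 4.0, 1.0]])   # criterion 1
B = np.array([[1.0, 0.5, 2.0], [2.0, 1.0, 4.0], [0.5, 0.25, 1.0]])       # criterion 2
w = feature_weights([A, B], criterion_preferences=(0.7, 0.3))

X_train = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
y_train = np.array([0, 1])
print(weighted_knn_predict(X_train, y_train, np.array([[0.9, 0.8, 1.1]]), w, k=1))
```

Each per-criterion matrix here is d x d, with entry (i, j) expressing how strongly feature i is preferred to feature j under that criterion; the synthesis step simply takes a preference-weighted combination of the per-criterion priority vectors.
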
Highlights:
- Feature weighting for kNN by a multi-criteria decision analysis tool called AHP.
- Automated weight assignment in the criteria matrix of AHP using group statistics.
- Criterion preference selection in AHP with Granger causality.
- Superior classification performance over kNN with many other feature weighting/selection methods.
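
The highlights mention Granger causality for criterion preference selection. As a hedged illustration only, the fragment below uses the `grangercausalitytests` function from statsmodels to check, on synthetic data, whether one criterion score sequence helps predict the other, and maps the outcome to a pair of preference values for AHP. Treating the two criterion score sequences as ordered series and using the (0.7, 0.3) preference split are assumptions made for this sketch, not the paper's actual scheme.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

def granger_p_value(target, candidate, maxlag=2):
    """Smallest ssr F-test p-value for 'candidate Granger-causes target'
    over lags 1..maxlag."""
    data = np.column_stack([target, candidate])  # column 2 is tested as a cause of column 1
    results = grangercausalitytests(data, maxlag=maxlag)
    return min(res[0]["ssr_ftest"][1] for res in results.values())

# Synthetic stand-ins for the two criterion score sequences.
rng = np.random.default_rng(0)
criterion_a = rng.normal(size=50)
criterion_b = 0.6 * np.roll(criterion_a, 1) + 0.4 * rng.normal(size=50)

p_a_causes_b = granger_p_value(criterion_b, criterion_a)
p_b_causes_a = granger_p_value(criterion_a, criterion_b)

# Placeholder rule: give the higher AHP preference to the criterion with
# stronger evidence of Granger-causing the other one.
preferences = (0.7, 0.3) if p_a_causes_b < p_b_causes_a else (0.3, 0.7)
print(preferences)
```
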
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2017.01.018