Feature Selection Through Optimization of K-nearest Neighbor Matching Gain
| Published in | 2010 International Conference on Intelligent Computation Technology and Automation, Vol. 2, pp. 309-312 |
|---|---|
| Main Authors | , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.05.2010 |
| Summary | Many problems in information processing involve some form of dimensionality reduction. In this paper, we propose a new model for feature evaluation and selection in unsupervised learning scenarios. The model makes no special assumptions about the nature of the data set. For each item in the data set, the original features induce a ranked list of its k nearest neighbors. The evaluation criterion favors reduced feature sets whose induced neighborhoods are most consistent with these ranked lists, and an efficient local descent search based on the model is adopted to select the reduced features. Our experiments with several data sets demonstrate that the proposed algorithm detects completely irrelevant features and removes additional features without significantly hurting the performance of the clustering algorithm. |
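The summary's criterion — scoring a reduced feature set by how well it preserves each item's k-nearest-neighbor list — can be illustrated with a minimal sketch. The function names and the neighbor-set-overlap measure below are assumptions for illustration; the record does not give the paper's exact matching-gain formula or descent procedure.

```python
import numpy as np

def knn_indices(X, k):
    # Pairwise squared Euclidean distances; exclude each point from
    # its own neighbor list by setting the diagonal to infinity.
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def matching_gain(X_full, X_reduced, k=5):
    # Illustrative consistency score (an assumption, not the paper's
    # exact formula): mean fraction of each item's k nearest neighbors
    # under the full features that survive under the reduced features.
    full = knn_indices(X_full, k)
    red = knn_indices(X_reduced, k)
    overlap = [len(set(f) & set(r)) / k for f, r in zip(full, red)]
    return float(np.mean(overlap))
```

A local descent search of the kind the summary mentions would then repeatedly drop (or swap) the feature whose removal keeps `matching_gain` highest, stopping when no move improves the score. A completely irrelevant noise feature should be droppable with little loss of gain.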
| ISBN | 9781424472796; 1424472792 |
| DOI | 10.1109/ICICTA.2010.608 |