Feature Selection Based on Neighborhood Self-Information

Bibliographic Details
Published in: IEEE Transactions on Cybernetics, Vol. 50, No. 9, pp. 4031-4042
Main Authors: Wang, Changzhong; Huang, Yang; Shao, Mingwen; Hu, Qinghua; Chen, Degang
Format: Journal Article
Language: English
Published: United States, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.09.2020

Summary: The concept of dependency in a neighborhood rough set model is an important evaluation function for feature selection. This function considers only the classification information contained in the lower approximation of the decision while ignoring the upper approximation. In this paper, we construct a class of uncertainty measures, called decision self-information, for feature selection. These measures take into account the uncertainty information in both the lower and the upper approximations. The relationships between these measures and their properties are discussed in detail. It is proven that the fourth measure, called relative neighborhood self-information, is better suited to feature selection than the others, because it not only considers both the lower and the upper approximations but also changes most markedly as the feature subset varies, which facilitates the selection of optimal feature subsets. Finally, a greedy feature selection algorithm is designed, and a series of numerical experiments is carried out to verify its effectiveness. The experimental results show that the proposed algorithm often selects fewer features and improves classification accuracy in most cases.
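
The summary describes a greedy search driven by a neighborhood-based uncertainty measure. As a rough, non-authoritative sketch of that idea (not the paper's algorithm), the Python code below performs greedy forward feature selection with a surrogate score that, like the proposed measures, uses both the neighborhood lower and upper approximations of the decision classes; the function names, the surrogate score itself, and the neighborhood radius delta are illustrative assumptions rather than definitions taken from the paper.

    import numpy as np

    def neighborhood_approximations(X, y, features, delta):
        # Sizes of the neighborhood lower/upper approximations of the decision,
        # computed in the subspace spanned by `features`. A sample contributes to
        # the lower approximation when every neighbor (Euclidean distance <= delta)
        # shares its label; the upper approximation counts, for each sample, how
        # many distinct classes its neighborhood touches.
        Xs = X[:, features]
        dist = np.sqrt(((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1))
        lower = upper = 0
        for i in range(len(y)):
            nbr_labels = y[dist[i] <= delta]
            lower += int(np.all(nbr_labels == y[i]))
            upper += len(np.unique(nbr_labels))
        return lower, upper

    def score(X, y, features, delta=0.2):
        # Hypothetical surrogate for the paper's relative neighborhood
        # self-information: reward consistently classified samples (large lower
        # approximation) and penalize boundary samples (upper approximation
        # exceeding the sample count). The paper's actual measure is defined
        # differently.
        n = len(y)
        lower, upper = neighborhood_approximations(X, y, features, delta)
        return lower / n - (upper - n) / n

    def greedy_select(X, y, delta=0.2):
        # Greedy forward search: repeatedly add the single feature that increases
        # the score the most; stop when no remaining feature improves it.
        remaining = list(range(X.shape[1]))
        selected = []
        best = score(X, y, selected, delta)
        while remaining:
            s, f = max((score(X, y, selected + [f], delta), f) for f in remaining)
            if s <= best:
                break
            best, selected = s, selected + [f]
            remaining.remove(f)
        return selected

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n = 60
        y = rng.integers(0, 2, n)
        useful = y[:, None] + 0.3 * rng.normal(size=(n, 2))   # two informative features
        noise = rng.normal(size=(n, 3))                       # three irrelevant features
        X = np.hstack([useful, noise])
        X = (X - X.min(0)) / (X.max(0) - X.min(0))            # scale so one delta fits all features
        print("selected feature indices:", greedy_select(X, y))

Scaling every feature to [0, 1] before the search is the usual preprocessing step in neighborhood-based methods, so that a single radius delta is meaningful across all features.
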
ISSN: 2168-2267, 2168-2275
DOI: 10.1109/TCYB.2019.2923430