
Feature selection by optimizing a lower bound of conditional mutual information.

Authors :
Peng, Hanyang
Fan, Yong
Source :
Information Sciences. Nov 2017, Vol. 418, p. 652-667. 16 p.
Publication Year :
2017

Abstract

A unified framework is proposed to select features by optimizing computationally feasible approximations of the high-dimensional conditional mutual information (CMI) between features and their associated class label under different assumptions. Under this unified framework, state-of-the-art information-theoretic feature selection algorithms are re-derived, and a new algorithm is proposed that selects features by optimizing a lower bound of the CMI under a weaker assumption than those adopted by existing methods. The new feature selection method integrates a plug-in component that distinguishes redundant features from irrelevant ones, improving the robustness of feature selection. Furthermore, a novel metric is proposed for evaluating feature selection methods on simulated data. The proposed method was compared with state-of-the-art feature selection methods using both the new evaluation metric and the classification performance of classifiers built upon the selected features. The experimental results demonstrate that the proposed method achieves promising performance across a variety of feature selection problems. [ABSTRACT FROM AUTHOR]
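As a rough illustration of the family of criteria the abstract refers to, the sketch below performs greedy forward feature selection in Python, scoring each candidate with a pairwise (JMI-style) surrogate of the conditional mutual information. The function names greedy_cmi_feature_selection and conditional_mutual_info, and the specific relevance-minus-redundancy score, are illustrative assumptions; they are not the paper's exact lower bound, nor its plug-in component for separating redundant from irrelevant features.

```python
import numpy as np
from sklearn.metrics import mutual_info_score


def conditional_mutual_info(a, b, y):
    """I(A; B | Y) for discrete variables, computed as a
    class-prior-weighted sum of per-class mutual information values."""
    total = 0.0
    for label in np.unique(y):
        mask = (y == label)
        total += mask.mean() * mutual_info_score(a[mask], b[mask])
    return total


def greedy_cmi_feature_selection(X, y, n_select):
    """Greedy forward selection with a pairwise surrogate of I(X_k; Y | S).

    Each candidate is scored by its relevance I(X_k; Y) minus the average
    redundancy with already-selected features that is not explained by the
    class label, i.e. I(X_k; X_j) - I(X_k; X_j | Y). Features in X are
    assumed discrete (or pre-discretized).
    """
    n_features = X.shape[1]
    selected, remaining = [], list(range(n_features))

    # Relevance of each feature with the class label, I(X_k; Y).
    relevance = np.array([mutual_info_score(X[:, k], y)
                          for k in range(n_features)])

    for _ in range(n_select):
        best_k, best_score = None, -np.inf
        for k in remaining:
            if selected:
                # Penalize redundancy with selected features, discounted
                # by the part of that redundancy conditioned on the label.
                penalty = np.mean([
                    mutual_info_score(X[:, k], X[:, j])
                    - conditional_mutual_info(X[:, k], X[:, j], y)
                    for j in selected
                ])
            else:
                penalty = 0.0
            score = relevance[k] - penalty
            if score > best_score:
                best_k, best_score = k, score
        selected.append(best_k)
        remaining.remove(best_k)
    return selected
```

A typical call would be selected = greedy_cmi_feature_selection(X_discrete, y, n_select=10) on discretized data; the pairwise score is only one member of the approximation family the paper unifies, chosen here because it is simple to compute from empirical contingency tables.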

Details

Language :
English
ISSN :
0020-0255
Volume :
418
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
125057514
Full Text :
https://doi.org/10.1016/j.ins.2017.08.036