Feature Selection via L1-Penalized Squared-Loss Mutual Information
- Publication Year :
- 2012
Abstract
- Feature selection is a technique for screening out less important features. Many existing supervised feature selection algorithms use redundancy and relevance as the main criteria for selecting features. However, feature interaction, potentially a key characteristic of real-world problems, has not received much attention. As an attempt to take feature interaction into account, we propose L1-LSMI, an L1-regularization-based algorithm that maximizes a squared-loss variant of mutual information between selected features and outputs. Numerical results show that L1-LSMI performs well in handling redundancy, detecting non-linear dependency, and accounting for feature interaction.
Comment: 25 pages
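The core idea, maximizing a least-squares estimate of squared-loss mutual information (LSMI) over L1-penalized feature weights, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the Gaussian kernel widths, the median-distance width heuristic, the ridge parameter, the penalty level `lam`, and the finite-difference gradient are all illustrative choices, and the function names `lsmi` and `l1_lsmi` are hypothetical.

```python
# Sketch of the L1-LSMI idea (illustrative, not the paper's code).
# lsmi() estimates squared-loss mutual information by least-squares
# density-ratio fitting with Gaussian kernels; l1_lsmi() then runs
# projected gradient ascent on nonnegative feature weights w, trading
# the LSMI score against an L1 penalty.
import numpy as np

def lsmi(x, y, ridge=1e-3):
    """LSMI estimate of squared-loss mutual information between x and y.

    x: (n, dx) array, y: (n, dy) array (reshape labels to (n, 1))."""
    n = len(x)
    b = min(100, n)                    # number of kernel centers
    cx, cy = x[:b], y[:b]              # centers = first b paired samples
    sx = np.median(np.abs(x - x.mean(0))) + 1e-12  # crude width heuristic
    sy = np.median(np.abs(y - y.mean(0))) + 1e-12
    K = np.exp(-((x[:, None, :] - cx[None]) ** 2).sum(-1) / (2 * sx ** 2))
    L = np.exp(-((y[:, None, :] - cy[None]) ** 2).sum(-1) / (2 * sy ** 2))
    H = (K.T @ K) * (L.T @ L) / n ** 2  # (b, b) cross-moment matrix
    h = (K * L).mean(0)                 # (b,) mean of paired basis values
    alpha = np.linalg.solve(H + ridge * np.eye(b), h)
    return 0.5 * h @ alpha - 0.5        # LSMI score (0 when independent)

def l1_lsmi(X, Y, lam=0.1, steps=50, lr=0.05, eps=1e-3):
    """Projected gradient ascent on feature weights w >= 0, maximizing
    LSMI(X * w, Y) - lam * ||w||_1 with a finite-difference gradient."""
    d = X.shape[1]
    w = np.ones(d)
    for _ in range(steps):
        base = lsmi(X * w, Y)
        g = np.zeros(d)
        for j in range(d):              # numerical gradient, per feature
            wj = w.copy()
            wj[j] += eps
            g[j] = (lsmi(X * wj, Y) - base) / eps
        # subgradient of -lam*||w||_1 is -lam for w > 0; clip keeps w >= 0
        w = np.maximum(w + lr * (g - lam), 0.0)
    return w

if __name__ == "__main__":
    # Toy check: the output depends non-linearly on feature 0 only,
    # so the learned weight vector should concentrate on w[0].
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    Y = X[:, :1] ** 2 + 0.1 * rng.normal(size=(200, 1))
    print(l1_lsmi(X, Y))
```

A real implementation would differentiate the LSMI score analytically rather than by finite differences, and would cross-validate the kernel widths and ridge parameter; the sketch keeps only the overall structure of penalized dependence maximization.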
- Subjects :
- Statistics - Machine Learning
- Computer Science - Learning
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.1210.1960
- Document Type :
- Working Paper
- Full Text :
- https://doi.org/10.1587/transinf.E96.D.1513