Discriminative Feature Selection via Employing Smooth and Robust Hinge Loss.

Authors :
Peng, Hanyang
Liu, Cheng-Lin
Source :
IEEE Transactions on Neural Networks & Learning Systems; Mar. 2019, Vol. 30, Issue 3, p788-802, 15p
Publication Year :
2019

Abstract

A wide variety of sparsity-inducing feature selection methods have been developed in recent years. Most of these approaches build their loss functions upon regression, since regression is general and easy to optimize, but it is not well suited to classification. In contrast, the hinge loss (HL) of support vector machines has proved powerful for classification tasks, but a model combining the existing multiclass HL with sparsity regularization is difficult to optimize. In view of this, we propose a new loss, called the smooth and robust HL, which combines the merits of regression and the HL while overcoming their drawbacks, and apply it to our sparsity-regularized feature selection model. To optimize the model, we present a new variant of the accelerated proximal gradient (APG) algorithm, which boosts the discriminative margins among different classes compared with standard APG algorithms. We further propose an efficient optimization technique to solve the proximal projection problem at each iteration step, which is a key component of the new APG algorithm. We theoretically prove that the new APG algorithm converges at a rate of $O(1/k^{2})$ when the problem is convex (where $k$ is the iteration counter), which is the optimal convergence rate for smooth problems. Experimental results on nine publicly available data sets demonstrate the effectiveness of our method. [ABSTRACT FROM AUTHOR]
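For orientation, the sketch below illustrates the two ingredients the abstract names: a smoothed hinge loss and an APG (FISTA-style) loop with a proximal step enforcing sparsity. The exact form of the paper's smooth and robust HL, its multiclass formulation, and its specialized proximal projection are not given in the abstract, so this uses a generic Huber-smoothed binary hinge and plain $\ell_1$ soft-thresholding as hypothetical stand-ins; all function names and parameters here are assumptions, not the authors' method.

```python
import numpy as np

def smoothed_hinge(z, delta=0.5):
    """Huber-style smoothing of max(0, 1 - z): quadratic near the kink,
    linear in the tail. A generic stand-in for the paper's smooth and
    robust hinge loss (exact form not given in the abstract)."""
    out = np.zeros_like(z)
    mid = (z < 1) & (z >= 1 - delta)
    low = z < 1 - delta
    out[mid] = (1 - z[mid]) ** 2 / (2 * delta)
    out[low] = 1 - z[low] - delta / 2
    return out

def smoothed_hinge_grad(z, delta=0.5):
    """Derivative of the smoothed hinge w.r.t. its margin argument z."""
    g = np.zeros_like(z)
    mid = (z < 1) & (z >= 1 - delta)
    low = z < 1 - delta
    g[mid] = -(1 - z[mid]) / delta
    g[low] = -1.0
    return g

def apg_l1(X, y, lam=0.1, n_iter=200, delta=0.5):
    """FISTA-style accelerated proximal gradient for
        min_w  mean_i smoothed_hinge(y_i * x_i^T w) + lam * ||w||_1,
    with y in {-1, +1}. The l1 soft-thresholding prox is a simplified
    substitute for the paper's sparsity regularizer and its
    specialized projection; standard FISTA attains the O(1/k^2) rate
    cited in the abstract for convex smooth problems."""
    n, d = X.shape
    # Lipschitz bound for the smooth part: ||X||_2^2 / (n * delta),
    # since the smoothed hinge gradient is (1/delta)-Lipschitz.
    L = np.linalg.norm(X, 2) ** 2 / (n * delta)
    w = np.zeros(d)
    v = w.copy()
    t = 1.0
    for _ in range(n_iter):
        z = y * (X @ v)
        grad = X.T @ (y * smoothed_hinge_grad(z, delta)) / n
        u = v - grad / L
        # Proximal step: soft-thresholding for the l1 penalty.
        w_new = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)
        # Nesterov momentum update.
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        v = w_new + ((t - 1) / t_new) * (w_new - w)
        w, t = w_new, t_new
    return w
```

In a feature selection setting, the sparsity pattern of the returned weight vector (or row norms of a weight matrix, in the multiclass case) would indicate which features are retained.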

Details

Language :
English
ISSN :
2162-237X
Volume :
30
Issue :
3
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
134886708
Full Text :
https://doi.org/10.1109/TNNLS.2018.2852297