
Non-convex Regularizations for Feature Selection in Ranking With Sparse SVM

Authors :
Laporte, Léa
Flamary, Rémi
Canu, Stéphane
Déjean, Sébastien
Mothe, Josiane
Source :
IEEE Transactions on Neural Networks and Learning Systems, IEEE, 2013, pp. 1-1
Publication Year :
2015

Abstract

Feature selection in learning to rank has recently emerged as a crucial issue. Whereas several preprocessing approaches have been proposed, few works have focused on integrating feature selection into the learning process itself. In this work, we propose a general framework for feature selection in learning to rank using SVM with a sparse regularization term. We investigate both classical convex regularizations, such as $\ell_1$ or weighted $\ell_1$, and non-convex regularization terms, such as the log penalty, the Minimax Concave Penalty (MCP), or the $\ell_p$ pseudo-norm with $p<1$. Two algorithms are proposed: first, an accelerated proximal approach for solving the convex problems; second, a reweighted $\ell_1$ scheme to address the non-convex regularizations. We conduct extensive experiments on nine datasets from the Letor 3.0 and Letor 4.0 corpora. Numerical results show that the proposed non-convex regularizations yield sparser models while preserving prediction performance: the number of selected features is reduced by up to a factor of six compared to the $\ell_1$ regularization. In addition, the software is publicly available on the web.
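The reweighted $\ell_1$ scheme mentioned in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation: it substitutes a plain least-squares loss for the ranking SVM loss, uses a basic proximal-gradient (ISTA) inner solver, and applies the standard reweighting rule $w_i = 1/(|\beta_i| + \varepsilon)$ associated with the log penalty $\sum_i \log(|\beta_i| + \varepsilon)$. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Elementwise proximal operator of the (weighted) l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_l1_ista(X, y, weights, lam, n_iter=500):
    # Proximal-gradient (ISTA) solver for the weighted-lasso surrogate:
    #   min_beta 0.5 * ||X beta - y||^2 + lam * sum_i weights_i * |beta_i|
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the smooth part
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = soft_threshold(beta - grad / L, lam * weights / L)
    return beta

def reweighted_l1(X, y, lam, eps=1e-3, n_outer=5):
    # Reweighted l1 scheme for a non-convex log penalty: each outer step
    # majorizes the penalty by a weighted l1 norm with
    # weights_i = 1 / (|beta_i| + eps), then solves the convex surrogate.
    weights = np.ones(X.shape[1])
    for _ in range(n_outer):
        beta = weighted_l1_ista(X, y, weights, lam)
        weights = 1.0 / (np.abs(beta) + eps)
    return beta
```

On a toy noiseless problem with a few active features, the reweighting step drives the weights of inactive features up sharply, which is what produces the extra sparsity over a single $\ell_1$ pass:

```python
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta_true = np.zeros(20)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true
beta = reweighted_l1(X, y, lam=0.5)
# Only a handful of coefficients should remain nonzero.
```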

Subjects

Subjects :
Computer Science - Learning

Details

Database :
arXiv
Journal :
IEEE Transactions on Neural Networks and Learning Systems, IEEE, 2013, pp. 1-1
Publication Type :
Report
Accession number :
edsarx.1507.00500
Document Type :
Working Paper
Full Text :
https://doi.org/10.1109/TNNLS.2013.2286696