Improving Sparsity and Scalability in Regularized Nonconvex Truncated-Loss Learning Problems.

Authors :
Tao, Qing
Wu, Gaowei
Chu, Dejun
Source :
IEEE Transactions on Neural Networks & Learning Systems; Jul 2018, Vol. 29, Issue 7, p2782-2793, 12p
Publication Year :
2018

Abstract

The truncated regular $L_{1}$-loss support vector machine can eliminate the excessive number of support vectors (SVs); thus, it has significant advantages in robustness and scalability. However, in this paper, we discover that the associated state-of-the-art solvers, such as the difference-of-convex algorithm and the concave–convex procedure, not only have a limited sparsity-promoting property for general truncated losses, especially the $L_{2}$-loss, but also scale poorly to large problems. To circumvent these drawbacks, we present a general multistage scheme with an explicit interpretation of SVs as well as outliers. In particular, we solve the general nonconvex truncated-loss minimization through a sequence of associated convex subproblems in which the outliers are removed in advance. The proposed algorithm can be regarded as a structural optimization approach that carefully accounts for the sparsity imposed by the nonconvex truncated losses. We show that this general multistage algorithm offers sufficient sparsity, especially for the truncated $L_{2}$-loss. To further improve scalability, we propose a linear multistep algorithm that employs a single iteration of coordinate descent to monotonically decrease the objective function at each stage, and a kernel algorithm that uses the Karush–Kuhn–Tucker conditions to cheaply find most of the outliers for the next stage. Comparison experiments demonstrate that our methods are superior in sparsity as well as in efficiency and scalability. [ABSTRACT FROM AUTHOR]
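
To make the multistage idea concrete, here is a minimal sketch, not the authors' exact algorithm: each stage solves a convex $L_{2}$-loss (squared-hinge) SVM on the points currently kept, drops as outliers the points whose convex loss exceeds a truncation level, and repeats until the outlier set stops changing. The helper name multistage_truncated_l2_svm, the truncation level s, the stage limit max_stages, and the use of scikit-learn's LinearSVC are all illustrative assumptions.

    # Hypothetical sketch of a multistage truncated L2-loss scheme (assumed details,
    # not the paper's solver): alternate between a convex subproblem on the kept
    # points and removal of points whose convex loss exceeds the truncation level s.
    import numpy as np
    from sklearn.svm import LinearSVC

    def multistage_truncated_l2_svm(X, y, C=1.0, s=4.0, max_stages=10):
        """y must be labeled +/-1; returns the final convex-stage classifier
        and the boolean mask of points kept (non-outliers) at convergence."""
        keep = np.ones(len(y), dtype=bool)           # start with all points kept
        clf = None
        for _ in range(max_stages):
            clf = LinearSVC(C=C, loss="squared_hinge")
            clf.fit(X[keep], y[keep])                # convex subproblem on kept points
            margins = y * clf.decision_function(X)   # y_i * f(x_i) on all points
            loss = np.maximum(0.0, 1.0 - margins) ** 2   # L2 (squared-hinge) loss
            new_keep = loss <= s                     # truncation: large-loss points become outliers
            if np.array_equal(new_keep, keep):       # outlier set stabilized: stop
                break
            keep = new_keep
        return clf, keep

Because outliers are excluded before each convex stage, they never enter the subproblem as SVs, which is the mechanism the abstract credits for the improved sparsity of the truncated $L_{2}$-loss.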

Details

Language :
English
ISSN :
2162-237X
Volume :
29
Issue :
7
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
130351492
Full Text :
https://doi.org/10.1109/TNNLS.2017.2705429