Support Vector Machine Classifier via L0/1 Soft-Margin Loss
- Author
- Yuan-Hai Shao, Naihua Xiu, Shenglong Zhou, Ce Zhang, and Huajun Wang
- Subjects
- Computer science, Applied Mathematics, Computational Theory and Mathematics, Artificial Intelligence, Computer Vision and Pattern Recognition, Software, Support vector machine, Soft margin, Working set, Optimality theory, Limit point, Robustness (computer science), Algorithm
Support vector machines (SVMs) have drawn wide attention over the last two decades due to their extensive applications, and a vast body of work has developed optimization algorithms for SVMs with various soft-margin losses. In this paper, we aim at solving an ideal soft-margin loss SVM: the L0/1 soft-margin loss SVM (dubbed L0/1-SVM). Many of the existing (non)convex soft-margin losses can be viewed as surrogates of the L0/1 soft-margin loss. Despite its discrete nature, we establish the optimality theory for the L0/1-SVM, including the existence of optimal solutions and their relationship to P-stationary points. These results not only enable us to deliver a rigorous definition of L0/1 support vectors but also allow us to define a working set. Integrating such a working set, we then propose a fast alternating direction method of multipliers (ADMM) whose limit point is a locally optimal solution to the L0/1-SVM. Finally, numerical experiments demonstrate that our proposed method outperforms leading classification solvers from the SVM community in terms of computational speed and number of support vectors. The larger the data size, the more evident its advantage.
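To make the loss being solved concrete, below is a minimal Python sketch contrasting the L0/1 soft-margin objective with the familiar hinge-loss surrogate. It assumes the usual linear soft-margin formulation min_{w,b} (1/2)||w||^2 + C * sum_i loss(1 - y_i(w^T x_i + b)), where the L0/1 loss simply counts margin violations; this is not the authors' working-set ADMM solver, and the function names and toy data are illustrative.

```python
# Sketch: L0/1 soft-margin objective vs. the hinge-loss surrogate.
# Assumes the standard linear soft-margin SVM objective; not the
# authors' ADMM method. Names and data are illustrative only.
import numpy as np

def l01_objective(w, b, X, y, C):
    """(1/2)||w||^2 + C * (number of samples with y_i (w^T x_i + b) < 1)."""
    margins = y * (X @ w + b)
    return 0.5 * w @ w + C * np.count_nonzero(1.0 - margins > 0)

def hinge_objective(w, b, X, y, C):
    """Convex surrogate: (1/2)||w||^2 + C * sum max(0, 1 - y_i (w^T x_i + b))."""
    margins = y * (X @ w + b)
    return 0.5 * w @ w + C * np.sum(np.maximum(0.0, 1.0 - margins))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))                    # toy 2-D data
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)   # linearly separable labels
    w, b, C = np.array([1.0, 1.0]), 0.0, 1.0
    print("L0/1 objective :", l01_objective(w, b, X, y, C))
    print("hinge objective:", hinge_objective(w, b, X, y, C))
```

Because the L0/1 term is discontinuous and nonconvex, classical solvers replace it with surrogates such as the hinge loss; the paper instead tackles the L0/1 objective directly via P-stationarity and a working-set ADMM.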
- Published
- 2022