1. A New Sparse LSSVM Method Based the Revised LARS
- Author
- Shuisheng Zhou and Mengnan Liu
- Subjects
- Computer science, Least-angle regression, Approximation algorithm, Pattern recognition, Matrix decomposition, Support vector machine, Kernel (linear algebra), Lasso (statistics), Least squares support vector machine, Feature (machine learning), Artificial intelligence, Data mining
- Abstract
Least squares support vector machine (LSSVM) has performance comparable to that of the support vector machine (SVM) and has been widely used for classification and regression problems. The solution of LSSVM is obtained by solving a system of linear equations, but it lacks sparseness, which makes LSSVM unable to handle large-scale data sets. The state-of-the-art least angle regression (LARS) method can obtain a sparse solution by solving the Least Absolute Shrinkage and Selection Operator (LASSO) problem. We therefore use the idea of LARS to obtain a sparse solution of the LSSVM, yielding an efficient method we call RLARS-LSSVM. The key feature of the method is that it iteratively selects the most important samples as support vectors while simultaneously removing the samples that are similar to the selected support vectors. Experimental results show that, for the same number of support vectors, the proposed method achieves much higher test accuracy than other sparse LSSVM methods.
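The abstract's core idea, greedily selecting important samples as support vectors while pruning candidates similar to those already chosen, can be illustrated with a minimal sketch. This is not the paper's exact RLARS-LSSVM algorithm; it is a hedged approximation that mimics the LARS selection criterion (pick the kernel column most correlated with the current residual), refits a ridge-regularized least-squares model on the selected columns, and discards candidates whose kernel similarity to the new support vector exceeds a threshold. All function names, the RBF kernel choice, and the parameters `lam`, `sim_thresh`, and `gamma` are illustrative assumptions, not from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """RBF (Gaussian) kernel matrix between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_sparse_lssvm(X, y, n_sv=10, lam=1e-2, sim_thresh=0.95, gamma=0.5):
    """LARS-inspired greedy sketch of a sparse LSSVM (illustrative, not the
    paper's exact method): iteratively add the sample whose kernel column is
    most correlated with the residual, then drop candidates too similar to it."""
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    candidates = set(range(n))
    sv, alpha = [], np.zeros(0)
    for _ in range(n_sv):
        if not candidates:
            break
        # residual of the current sparse model on all training points
        pred = K[:, sv] @ alpha if sv else np.zeros(n)
        resid = y - pred
        # LARS-like criterion: largest absolute correlation with the residual
        cand = list(candidates)
        scores = np.abs(K[:, cand].T @ resid)
        j = cand[int(np.argmax(scores))]
        sv.append(j)
        candidates.discard(j)
        # remove candidates highly similar to the newly selected support vector
        candidates -= {i for i in candidates if K[i, j] > sim_thresh}
        # refit ridge-regularized least squares on the selected kernel columns
        Ks = K[:, sv]
        alpha = np.linalg.solve(Ks.T @ Ks + lam * np.eye(len(sv)), Ks.T @ y)
    return sv, alpha
```

A usage sketch: `sv, alpha = greedy_sparse_lssvm(X, y, n_sv=8)` returns at most 8 support-vector indices and their coefficients, and predictions on new data are `rbf_kernel(X_new, X[sv]) @ alpha`. The similarity-based pruning is what keeps the model sparse: near-duplicate samples never enter the candidate pool again once a representative has been chosen.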
- Published
- 2017