On a new approach for Lagrangian support vector regression
- Author
- Balasundaram, S. and Benipal, Gagandeep
- Subjects
- LAGRANGE equations, SUPPORT vector machines, REGRESSION analysis, ABSOLUTE value, ITERATIVE methods (Mathematics)
- Abstract
In this paper, a simplification of the necessary and sufficient Karush-Kuhn-Tucker (KKT) optimality conditions for Lagrangian support vector regression in 2-norm is proposed, resulting in a naïve root-finding problem for a system of equations in m variables, where m is the number of training examples. However, since the resulting system of equations contains terms involving the nonsmooth "plus" function, two approaches are considered: (i) reformulate the problem as an equivalent absolute value equation problem and solve it by functional iterative and generalized Newton methods, and (ii) solve the original root-finding problem directly by the generalized Newton method. Proofs of convergence and pseudo-codes of the iterative methods are given. Numerical experiments performed on a number of synthetic and real-world benchmark datasets show similar generalization performance with much faster learning speed in comparison with support vector regression (SVR), and performance as good as that of least squares SVR, the unconstrained Lagrangian SVR proposed in Balasundaram et al. (Neural Netw 51:67-79, 2014), twin SVR and the extreme learning machine, clearly indicating the effectiveness and suitability of the proposed formulation solved by functional iterative and generalized Newton methods. [ABSTRACT FROM AUTHOR]
- Published
- 2018
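The two solution strategies named in the abstract can be illustrated on the standard absolute value equation (AVE) form A x - |x| = b. The sketch below is a generic illustration, not the authors' exact algorithm, variable layout, or the SVR-derived system from the paper: `functional_iteration_ave` and `generalized_newton_ave` are hypothetical names, and the 2x2 test problem is invented. The functional iteration is the fixed point x = A^{-1}(|x| + b); the generalized Newton step uses the generalized Jacobian A - D(x), with D(x) = diag(sign(x)).

```python
import numpy as np

def functional_iteration_ave(A, b, tol=1e-10, max_iter=500):
    """Fixed-point iteration x_{k+1} = A^{-1}(|x_k| + b) for the AVE
    A x - |x| = b; it contracts when the smallest singular value
    of A exceeds 1, so ||A^{-1}||_2 < 1."""
    x = np.linalg.solve(A, b)  # start from the solution of A x = b
    for _ in range(max_iter):
        x_new = np.linalg.solve(A, np.abs(x) + b)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def generalized_newton_ave(A, b, tol=1e-10, max_iter=100):
    """Generalized Newton method for A x - |x| = b: a generalized
    Jacobian of x -> A x - |x| is A - D(x) with D(x) = diag(sign(x)),
    giving the iteration x_{k+1} = (A - D(x_k))^{-1} b."""
    x = np.linalg.solve(A, b)
    for _ in range(max_iter):
        x_new = np.linalg.solve(A - np.diag(np.sign(x)), b)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy problem with a known solution; the singular values of A exceed 1,
# so the AVE has a unique solution and both iterations converge to it.
A = np.array([[4.0, 1.0],
              [1.0, 5.0]])
x_star = np.array([1.0, -2.0])
b = A @ x_star - np.abs(x_star)

x_fi = functional_iteration_ave(A, b)
x_gn = generalized_newton_ave(A, b)
```

On well-conditioned problems the Newton iteration typically settles in very few steps (here the sign pattern of the iterate is identified after one linear solve), while the functional iteration converges linearly at a rate governed by ||A^{-1}||_2.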