
On a new approach for Lagrangian support vector regression.

Authors :
Balasundaram, S.
Benipal, Gagandeep
Source :
Neural Computing & Applications. May 2018, Vol. 29 Issue 9, p533-551. 19p.
Publication Year :
2018

Abstract

In this paper, a simplification of the necessary and sufficient Karush-Kuhn-Tucker (KKT) optimality conditions for Lagrangian support vector regression in 2-norm is proposed, resulting in a naïve root-finding problem for a system of equations in m variables, where m is the number of training examples. However, since the resulting system of equations contains terms involving the nonsmooth "plus" function, two approaches are considered: (i) reformulating the problem into an equivalent absolute value equation problem and solving it by functional iterative and generalized Newton methods, and (ii) solving the original root-finding problem directly by the generalized Newton method. Proofs of convergence and pseudo-codes of the iterative methods are given. Numerical experiments performed on a number of synthetic and real-world benchmark datasets show generalization performance similar to that of support vector regression (SVR) with much faster learning speed, and performance as good as that of the least squares SVR, the unconstrained Lagrangian SVR proposed in Balasundaram et al. (Neural Netw 51:67-79, 2014), the twin SVR and the extreme learning machine, clearly indicating the effectiveness and suitability of the proposed problem formulation solved by functional iterative and generalized Newton methods. [ABSTRACT FROM AUTHOR]
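
The record does not reproduce the paper's equations, so the following is only a minimal, generic sketch of the generalized Newton idea the abstract names: solving a piecewise-linear system that contains the nonsmooth plus function (t)_+ = max(t, 0). The system f(x) = Cx + d + (Ax + b)_+ = 0 and all names in the code (A, b, C, d, generalized_newton) are illustrative placeholders under assumed dimensions, not the authors' actual Lagrangian SVR formulation; the elementwise identity (t)_+ = (t + |t|)/2 is what connects such systems to the absolute value equation reformulation mentioned in the abstract.

```python
# Illustrative sketch only: a generalized (semismooth) Newton iteration for a
# generic piecewise-linear system containing the nonsmooth "plus" function,
#     f(x) = C x + d + (A x + b)_+ = 0,
# where (t)_+ = max(t, 0) elementwise. A, b, C, d are placeholders, NOT the
# paper's Lagrangian SVR formulation.
import numpy as np

def plus(t):
    """Elementwise plus function (t)_+ = max(t, 0)."""
    return np.maximum(t, 0.0)

def generalized_newton(A, b, C, d, x0, tol=1e-8, max_iter=100):
    """Solve C x + d + (A x + b)_+ = 0 by a generalized Newton method.

    A generalized Jacobian of f at x is J = C + diag(step(A x + b)) A,
    where step(t) = 1 if t > 0 and 0 otherwise (a subgradient of (t)_+).
    """
    x = x0.copy()
    for _ in range(max_iter):
        r = C @ x + d + plus(A @ x + b)             # residual f(x)
        if np.linalg.norm(r) < tol:
            break
        D = np.diag((A @ x + b > 0).astype(float))  # active-set indicator
        J = C + D @ A                               # generalized Jacobian
        x = x - np.linalg.solve(J, r)               # Newton step
    return x

# Tiny smoke test on a random, well-conditioned instance.
rng = np.random.default_rng(0)
m = 5
A = rng.standard_normal((m, m))
C = 5.0 * np.eye(m)                # dominant C keeps J nonsingular here
b = rng.standard_normal(m)
d = rng.standard_normal(m)
x = generalized_newton(A, b, C, d, np.zeros(m))
print(np.linalg.norm(C @ x + d + plus(A @ x + b)))  # ≈ 0 at convergence
```

The active-set indicator D is a valid element of the generalized Jacobian of the plus function, which is what keeps the Newton step well defined despite the nonsmoothness; since f is piecewise linear, each iteration amounts to solving the linear piece selected by the current sign pattern of Ax + b.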

Details

Language :
English
ISSN :
09410643
Volume :
29
Issue :
9
Database :
Academic Search Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
128815173
Full Text :
https://doi.org/10.1007/s00521-016-2521-3