Global Convergence of SMO Algorithm for Support Vector Regression
- Source :
- IEEE Transactions on Neural Networks. 19:971-982
- Publication Year :
- 2008
- Publisher :
- Institute of Electrical and Electronics Engineers (IEEE), 2008.
-
Abstract
- Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given l training samples, SVR is formulated as a convex quadratic programming (QP) problem with l pairs of variables. We prove that if two pairs of variables violating the optimality condition are chosen for update in each step and subproblems are solved in a certain way, then the SMO algorithm always stops within a finite number of iterations after finding an optimal solution. Also, efficient implementation techniques for the SMO algorithm are presented and compared experimentally with other SMO algorithms.
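The decomposition strategy described in the abstract — repeatedly selecting variables that violate the optimality conditions and solving a small subproblem — underlies widely used SVR solvers. As a minimal sketch (not the paper's own implementation), the following uses scikit-learn's `SVR`, whose LIBSVM backend solves the dual QP with an SMO-type decomposition method; the toy dataset and hyperparameter values are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVR  # LIBSVM backend: SMO-type decomposition on the dual QP

# Toy 1-D regression problem: noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)

# epsilon sets the width of the insensitive tube; tol is the stopping
# tolerance on the optimality (KKT) conditions checked at each iteration.
model = SVR(kernel="rbf", C=1.0, epsilon=0.1, tol=1e-3)
model.fit(X, y)

# Only samples on or outside the epsilon-tube end up as support vectors.
print(len(model.support_), "support vectors out of", len(X))
```

Tightening `tol` makes the solver iterate closer to the exact QP optimum at the cost of more SMO steps; the finite-termination guarantee studied in the paper concerns exactly this kind of stopping behavior.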
- Subjects :
- Mathematical optimization
- Artificial neural network
- Optimality criterion
- Computer Networks and Communications
- Computer science
- Signal Processing, Computer-Assisted
- Regression analysis
- General Medicine
- Pattern Recognition, Automated
- Computer Science Applications
- Support vector machine
- Artificial Intelligence
- Convergence (routing)
- Convex optimization
- Animals
- Humans
- Sequential minimal optimization
- Computer Simulation
- Neural Networks, Computer
- Quadratic programming
- Finite set
- Algorithm
- Algorithms
- Software
Details
- ISSN :
- 1941-0093 and 1045-9227
- Volume :
- 19
- Database :
- OpenAIRE
- Journal :
- IEEE Transactions on Neural Networks
- Accession number :
- edsair.doi.dedup.....19bef315662748841b094dc4f7af1c2f