
Subspace based least squares support vector machines for pattern classification

Authors :
Takuya Kitamura
Shigeo Abe
Kazuhiro Fukui
Source :
IJCNN
Publication Year :
2009
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2009.

Abstract

In this paper, we discuss subspace based least squares support vector machines (SSLS-SVMs), in which an input vector is classified into the class with the maximum similarity. Namely, we define the similarity measure for each class as a weighted sum of vectors called dictionaries, and we optimize the weights so that the margin between classes is maximized. Because a similarity measure is defined for each class, for each training sample the similarity to its own class must be the largest among all the similarity measures. Introducing slack variables, we express these constraints as equality constraints. The resulting SSLS-SVMs are then similar to LS-SVMs in an all-at-once formulation. Because the all-at-once formulation is inefficient, we also propose SSLS-SVMs in a one-against-all formulation. We demonstrate the effectiveness of the proposed methods by comparison with the conventional method on two-class problems.
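
As a rough illustration of the formulation sketched in the abstract (a sketch based on the abstract alone, not taken from the paper; the symbols S_c, w_{cj}, v_{cj}, K, C, b_c, y_i, and \xi_i are assumed notation), the similarity of an input x to class c could be written as a weighted sum over that class's dictionary vectors,

S_c(x) = \sum_{j=1}^{m_c} w_{cj}\, K(x, v_{cj}),

and, in the one-against-all variant, the weights for class c could be obtained from an LS-SVM-like problem in which slack variables enter through equality constraints,

\min_{w_c,\, b_c,\, \xi}\ \frac{1}{2}\lVert w_c \rVert^2 + \frac{C}{2}\sum_i \xi_i^2
\quad \text{subject to} \quad y_i \bigl( S_c(x_i) + b_c \bigr) = 1 - \xi_i,

where y_i = 1 if x_i belongs to class c and y_i = -1 otherwise. A test input is then assigned to the class whose similarity S_c(x) is largest.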

Details

Language :
English
Database :
OpenAIRE
Journal :
2009 International Joint Conference on Neural Networks (IJCNN 2009)
Accession number :
edsair.doi.dedup.....de9b8fd5bc2a8665507cd9155d2db157