
Multiple Nonlinear Subspace Methods Using Subspace-based Support Vector Machines

Authors :
Shigeo Abe
Takuya Kitamura
Yusuke Tanaka
Source :
ICMLA (1)
Publication Year :
2011
Publisher :
IEEE, 2011.

Abstract

In this paper, we propose multiple nonlinear subspace methods (MNSMs), in which each class consists of several subspaces with different kernel parameters. For each class and each candidate kernel parameter, we generate the subspace by kernel principal component analysis (KPCA) and compute the projection length of an input vector onto that subspace. Then, for each class, we define the discriminant function as the weighted sum of these lengths. The weights in the discriminant function are optimized by subspace-based support vector machines (SS-SVMs) so that the margin between classes is maximized while the classification error is minimized. Thus, the subspaces of each class are weighted from the standpoint of class separability. Moreover, the computational cost of model selection for MNSMs is lower than that for SS-SVMs, because SS-SVMs require two hyperparameters, the kernel parameter and the margin parameter, to be chosen before training. We show the advantages of the proposed method by computer experiments with benchmark data sets.
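The following Python sketch is a rough illustration of the discriminant described in the abstract, not the authors' implementation: for one class, it sums KPCA projection lengths of a test vector over several candidate RBF kernel parameters. It assumes scikit-learn's KernelPCA, uses uniform placeholder weights in place of the SS-SVM-optimized weights, and all data, gamma values, and dimensions are hypothetical.

```python
# Sketch of the MNSM per-class discriminant: a weighted sum of KPCA
# projection lengths over candidate kernel parameters. The SS-SVM weight
# optimization from the paper is omitted; uniform weights stand in for it.
import numpy as np
from sklearn.decomposition import KernelPCA

def projection_length(kpca, x):
    """Squared norm of the projection of x onto a fitted KPCA subspace."""
    return float(np.sum(kpca.transform(x.reshape(1, -1)) ** 2))

def class_discriminant(X_class, x, gammas, n_components=5, weights=None):
    """Weighted sum of projection lengths of x onto the subspaces of one
    class, one subspace per candidate RBF kernel parameter gamma."""
    if weights is None:
        # Placeholder: the paper optimizes these weights with SS-SVMs.
        weights = np.ones(len(gammas)) / len(gammas)
    lengths = []
    for gamma in gammas:
        kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=gamma)
        kpca.fit(X_class)                      # subspace for this class/gamma
        lengths.append(projection_length(kpca, x))
    return float(np.dot(weights, lengths))

# Hypothetical usage: classify x as the class with the largest discriminant.
rng = np.random.default_rng(0)
X_a = rng.normal(0.0, 1.0, size=(50, 4))       # synthetic training data, class A
X_b = rng.normal(2.0, 1.0, size=(50, 4))       # synthetic training data, class B
x = rng.normal(2.0, 1.0, size=4)               # synthetic test vector
gammas = [0.1, 1.0, 10.0]                      # candidate kernel parameters
scores = {"A": class_discriminant(X_a, x, gammas),
          "B": class_discriminant(X_b, x, gammas)}
print(max(scores, key=scores.get))
```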

Details

Database :
OpenAIRE
Journal :
2011 10th International Conference on Machine Learning and Applications and Workshops
Accession number :
edsair.doi...........d137a236067e6cf48911d163aadd70a7
Full Text :
https://doi.org/10.1109/icmla.2011.100