1. Subspace based linear programming support vector machines
- Authors
- Kazuhiro Fukui, Syogo Takeuchi, Shigeo Abe, and Takuya Kitamura
- Subjects
Support vector machine, Kernel (linear algebra), Similarity (network science), Hyperplane, business.industry, Pattern recognition, Artificial intelligence, Similarity measure, business, Linear subspace, Subspace topology, Kernel principal component analysis, Mathematics
- Abstract
In subspace methods, the subspace associated with a class is represented by a small number of vectors called dictionaries; a similarity measure is defined using these dictionaries, and an input is classified into the class with the highest similarity. Usually, each dictionary is given an equal weight. However, if the subspaces of different classes overlap, the similarity measures in the overlapping regions do not give useful information for classification. In this paper, we propose optimizing the weights of the dictionaries using the idea of support vector machines (SVMs). Namely, we first map the input space into the empirical feature space, perform kernel principal component analysis (KPCA) for each class, and define a similarity measure. Then, noting that the similarity measure corresponds to a hyperplane, we formulate the optimization problem as maximizing the margin between the class associated with the dictionaries and the remaining classes. The optimization problem results in an all-at-once formulation of linear SVMs. We compare the effectiveness of the proposed method with that of conventional methods on two-class problems.
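The classification scheme described above (class subspaces spanned by principal directions, a weighted-projection similarity, and argmax over classes) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it uses plain linear PCA in the input space rather than KPCA in the empirical feature space, uses equal dictionary weights rather than the LP-SVM-optimized ones the paper proposes, and the data, dimensions, and function names are assumptions for illustration only.

```python
import numpy as np

def class_dictionaries(X, r):
    """Top-r principal directions (the 'dictionaries') of one class.
    X: (n_samples, dim) array of that class's training vectors.
    A stand-in for the per-class KPCA step in the paper."""
    # Rows of Vt are orthonormal principal directions of the centered data.
    _, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return Vt[:r].T  # (dim, r), columns span the class subspace

def similarity(x, V, w):
    """Weighted sum of squared projection coefficients of x onto the
    class subspace. With equal weights w this is the classical subspace
    similarity; the paper instead learns w by maximizing an SVM margin."""
    p = V.T @ x / np.linalg.norm(x)  # normalized projection coefficients
    return float(w @ (p ** 2))

# Toy two-class problem (synthetic data, assumed for illustration):
# class A varies mostly in dims 0-1, class B mostly in dims 2-3.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 5)) * np.array([3.0, 1.0, 0.1, 0.1, 0.1])
B = rng.normal(size=(50, 5)) * np.array([0.1, 0.1, 3.0, 1.0, 0.1])

dicts = [class_dictionaries(A, 2), class_dictionaries(B, 2)]
w = np.ones(2)  # equal weights; the proposed method would optimize these

x = np.array([2.5, 0.5, 0.0, 0.1, 0.0])  # looks like a class-A sample
scores = [similarity(x, V, w) for V in dicts]
label = int(np.argmax(scores))  # classify into the most similar subspace
```

In the paper's method, the equal-weight vector `w` would be replaced by per-dictionary weights learned by treating each class's similarity function as a hyperplane and solving a single linear program over all classes at once.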
- Published
- 2009