
Feature Selection and Fast Training of Subspace Based Support Vector Machines

Authors:
Shigeo Abe
Syogo Takeuchi
Takuya Kitamura
Source:
ResearcherID, IJCNN

Abstract

In this paper, we propose two methods for subspace-based support vector machines (SS-SVMs), namely subspace-based least squares support vector machines (SSLS-SVMs) and subspace-based linear programming support vector machines (SSLP-SVMs): 1) optimal selection of the dictionaries of each class subspace from the standpoint of class separability, and 2) speeding up the training of SS-SVMs. In method 1), for SSLS-SVMs we select the dictionaries with optimized weights, and for SSLP-SVMs we select the dictionaries without non-negative constraints. In method 2), the empirical feature space is generated using only the training data belonging to a class instead of all the training data. Thus, both the dimension of the empirical feature space and the training cost are reduced. We demonstrate the effectiveness of the proposed methods over the conventional method on two-class benchmark datasets.
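The record contains no code, but the idea behind method 2) can be illustrated with a short sketch. The Python/NumPy snippet below is not from the paper; the kernel choice, the function names, and the kernel-PCA-style construction of the empirical feature space are assumptions. It shows how a feature map built from a single class's samples yields an empirical feature space whose dimension is bounded by that class's size rather than by the full training set size.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def empirical_feature_map(X_class, gamma=1.0, tol=1e-10):
    """Empirical feature map built from the samples of one class only.

    With K = P diag(lam) P^T, a sample x is mapped by
    h(x) = diag(lam)^(-1/2) P^T k(x, X_class),
    so the feature-space dimension is at most the class size.
    """
    K = rbf_kernel(X_class, X_class, gamma)
    lam, P = np.linalg.eigh(K)
    keep = lam > tol                      # drop numerically zero eigenvalues
    lam, P = lam[keep], P[:, keep]
    W = P / np.sqrt(lam)                  # mapping matrix, shape (N_c, r)

    def h(X):
        return rbf_kernel(X, X_class, gamma) @ W
    return h

# Usage sketch: one map per class, each built only from that class's samples.
# X1, X2 = ...                      # training samples of class 1 and class 2
# h1 = empirical_feature_map(X1)    # dimension <= len(X1), not len(X1)+len(X2)
# Z = h1(np.vstack([X1, X2]))       # all samples mapped into class 1's space
```

Because each class map uses only its own kernel matrix, the eigendecomposition and subsequent training operate on smaller matrices, which is the source of the speed-up claimed for method 2).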

Details

Database:
OpenAIRE
Journal:
ResearcherID, IJCNN
Accession number:
edsair.doi.dedup.....e83f82601defc11a6ac4cf9c582c8cbe