151. Feature Selection and Fast Training of Subspace Based Support Vector Machines
- Authors
Shigeo Abe, Syogo Takeuchi, and Takuya Kitamura
- Subjects
Feature vector, Feature selection, Pattern recognition, Machine learning, Support vector machine, Random subspace method, Dimension (vector space), Least squares support vector machine, Sequential minimal optimization, Artificial intelligence, Subspace topology, Mathematics
- Abstract
In this paper, we propose two methods for subspace based support vector machines (SS-SVMs), namely subspace based least squares support vector machines (SSLS-SVMs) and subspace based linear programming support vector machines (SSLP-SVMs): 1) optimal selection of the dictionaries of each class subspace from the standpoint of class separability, and 2) speedup of SS-SVM training. In method 1), for SSLS-SVMs we select the dictionaries with optimized weights, and for SSLP-SVMs we select the dictionaries without non-negativity constraints. In method 2), the empirical feature space is generated using only the training data belonging to a class instead of all the training data; thus both the dimension of the empirical feature space and the training cost are reduced. We demonstrate the effectiveness of the proposed methods over the conventional method on two-class benchmark datasets.
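The dimension-reduction idea in method 2) can be illustrated with a small sketch. The paper's exact formulation is not reproduced here; this is a generic construction of an empirical feature map from a kernel matrix, assuming an RBF kernel and eigendecomposition-based whitening. The function names (`rbf_kernel`, `empirical_feature_map`) are illustrative, not from the paper. The key point matches the abstract: building the map from the data of a single class bounds the empirical feature space dimension by that class's sample count rather than the full training set size.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel values between rows of X and rows of Y.
    d = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d)

def empirical_feature_map(X_class, gamma=1.0, tol=1e-10):
    # Kernel matrix over one class's training data only; its size (and hence
    # the empirical feature space dimension) is the class sample count,
    # not the size of the whole training set.
    K = rbf_kernel(X_class, X_class, gamma)
    w, V = np.linalg.eigh(K)          # K = V diag(w) V^T
    keep = w > tol                    # discard near-zero eigenvalues
    w, V = w[keep], V[:, keep]
    A = V / np.sqrt(w)                # projects kernel vectors into the empirical space
    def phi(X):
        # Empirical feature vector: whitened kernel evaluations against X_class.
        return rbf_kernel(X, X_class, gamma) @ A
    return phi
```

By construction, inner products in this empirical feature space reproduce the kernel values on the class data, so kernel-based training can be replaced by linear training on the (lower-dimensional) mapped vectors.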