
Improvement of the kernel minimum squared error model for fast feature extraction.

Authors :
Wang, Jinghua
Wang, Peng
Li, Qin
You, Jane
Source :
Neural Computing & Applications. Jul 2013, Vol. 23, Issue 1, p53-59. 7p. 2 Charts.
Publication Year :
2013

Abstract

The kernel minimum squared error (KMSE) model expresses the feature extractor as a linear combination of all the training samples in the high-dimensional kernel space. To extract a feature from a sample, KMSE must therefore evaluate as many kernel functions as there are training samples, so the computational cost of KMSE-based feature extraction grows linearly with the size of the training set. In this paper, we propose an efficient kernel minimum squared error (EKMSE) model for two-class classification. The proposed EKMSE expresses each feature extractor as a linear combination of nodes, which form a small subset of the training samples. To extract a feature from a sample, EKMSE needs to evaluate only as many kernel functions as there are nodes. Since the nodes are typically far fewer than the training samples, EKMSE is much faster than KMSE in feature extraction. EKMSE can achieve the same training accuracy as the standard KMSE, and it also avoids the overfitting problem. We implement the EKMSE model using two algorithms. Experimental results demonstrate the feasibility of the EKMSE model. [ABSTRACT FROM AUTHOR]
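To make the cost difference described in the abstract concrete, the sketch below contrasts a full kernel expansion over all training samples with a node-based expansion over a small subset. It is a minimal illustration, not the paper's method: the RBF kernel, the least-squares fit, and in particular the random node selection are assumptions made here purely for demonstration (the paper's two algorithms choose the nodes differently).

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # Gaussian (RBF) kernel between rows of a and rows of b.
    sq = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2 * a @ b.T
    return np.exp(-gamma * sq)

# Toy two-class data (hypothetical stand-in for a real training set).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(2, 1, (100, 5))])
y = np.hstack([-np.ones(100), np.ones(100)])
N = len(X)

# --- KMSE-style extractor: one coefficient per training sample -------------
K = rbf_kernel(X, X)                        # N x N kernel matrix
A = np.hstack([np.ones((N, 1)), K])         # bias column + N kernel columns
w_full = np.linalg.lstsq(A, y, rcond=None)[0]

def kmse_feature(x_new):
    # Extracting a feature costs N kernel evaluations (one per training sample).
    return w_full[0] + rbf_kernel(x_new, X) @ w_full[1:]

# --- EKMSE-style extractor: coefficients only on a small node set ----------
# A random subset of size M << N is used here only for illustration.
nodes = X[rng.choice(N, size=20, replace=False)]
K_nodes = rbf_kernel(X, nodes)                      # N x M design matrix
A_n = np.hstack([np.ones((N, 1)), K_nodes])
w_nodes = np.linalg.lstsq(A_n, y, rcond=None)[0]

def ekmse_feature(x_new):
    # Only M kernel evaluations per sample, hence much faster extraction.
    return w_nodes[0] + rbf_kernel(x_new, nodes) @ w_nodes[1:]

x_test = rng.normal(1, 1, (1, 5))
print(kmse_feature(x_test), ekmse_feature(x_test))
```

In this toy setup the two extractors produce similar feature values, but the node-based one evaluates 20 kernel functions per sample instead of 200, which mirrors the speed-up the abstract claims.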

Details

Language :
English
ISSN :
0941-0643
Volume :
23
Issue :
1
Database :
Academic Search Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
88349595
Full Text :
https://doi.org/10.1007/s00521-012-0813-9