Out-of-Sample Extensions for Non-Parametric Kernel Methods.
Source: IEEE Transactions on Neural Networks & Learning Systems, Feb. 2017, Vol. 28, Issue 2, pp. 334-345 (12 pp.)
Publication Year: 2017
Abstract
Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies have been devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to exploit the information in the data, and may thus characterize data similarity better. Kernel methods that use nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand; they have no straightforward extension to out-of-sample data points and thus cannot be applied to inductive learning. In this paper, we show how to make nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper-reproducing kernel Hilbert space (hyper-RKHS) is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over state-of-the-art parametric kernel methods. [ABSTRACT FROM AUTHOR]
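
The abstract's key idea, extending a learned kernel matrix K to a kernel function by regression in a hyper-RKHS, can be illustrated with a short sketch. The sketch below is not the paper's exact formulation: the Gaussian base kernel g and the product hyper-kernel H((a, b), (c, e)) = g(a, c) g(b, e) are illustrative assumptions, chosen so the pairwise ridge-regression system factorizes through the eigendecomposition of the base Gram matrix.

```python
import numpy as np

def gaussian(X, Y, sigma=1.0):
    """Gaussian base kernel g between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_out_of_sample_extension(X, K, lam=1e-2, sigma=1.0):
    """Regress the learned kernel matrix K onto a hyper-RKHS.

    Solves  min_A ||K - G A G||_F^2 + lam * ||k||^2  for the coefficient
    matrix A of k(a, b) = sum_ij A_ij g(a, x_i) g(x_j, b), where G is the
    base Gram matrix g(X, X). With the product hyper-kernel, the normal
    equations reduce to G A G + lam * A = K, solved via eigh(G).
    """
    G = gaussian(X, X, sigma)
    w, V = np.linalg.eigh(G)                 # G = V diag(w) V^T
    K_rot = V.T @ K @ V                      # rotate K into the eigenbasis
    A_rot = K_rot / (np.outer(w, w) + lam)   # elementwise solve per eigen-pair
    return V @ A_rot @ V.T

def kernel_function(X, A, Xa, Xb, sigma=1.0):
    """Evaluate the extended kernel k on out-of-sample points Xa, Xb."""
    return gaussian(Xa, X, sigma) @ A @ gaussian(X, Xb, sigma)

# Toy usage: pretend K came from a transductive nonparametric kernel
# learner on 50 training points; extend it to 5 unseen points.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
K = gaussian(X, X, sigma=1.5)                # stand-in for a learned kernel matrix
A = fit_out_of_sample_extension(X, K)
X_new = rng.normal(size=(5, 4))
K_new = kernel_function(X, A, X_new, X)      # (5, 50) cross-kernel block
```

With this choice of hyper-kernel, the learned function k(a, b) = g(a, X) A g(X, b) approximately reproduces K on the training pairs (exactly as lam tends to 0) and extends smoothly to unseen points, which is the out-of-sample behavior the abstract describes.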
Subjects:
*KERNEL functions
*NONPARAMETRIC statistics
*SUPPORT vector machines
Details
Language: English
ISSN: 2162-237X
Volume: 28
Issue: 2
Database: Academic Search Index
Journal: IEEE Transactions on Neural Networks & Learning Systems
Publication Type: Periodical
Accession Number: 120846826
Full Text: https://doi.org/10.1109/TNNLS.2015.2512277