Least Square Regularized Regression in Sum Space
- Source :
- IEEE Transactions on Neural Networks and Learning Systems; 2013, Vol. 24 Issue: 4 p635-646, 12p
- Publication Year :
- 2013
Abstract
- This paper proposes a least square regularized regression algorithm in the sum space of reproducing kernel Hilbert spaces (RKHSs) for nonflat function approximation, and obtains the solution of the algorithm by solving a system of linear equations. The algorithm approximates the low- and high-frequency components of the target function with large- and small-scale kernels, respectively. The convergence and learning rate are analyzed. We measure the complexity of the sum space by its covering number and demonstrate that this covering number can be bounded by the product of the covering numbers of the basic RKHSs. For the sum space of RKHSs with Gaussian kernels, by choosing appropriate parameters, we trade off the sample error against the regularization error and obtain a polynomial learning rate, which is better than that in any single RKHS. The utility of this method is illustrated with two simulated data sets and five real-life databases.
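- The multi-scale idea described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's implementation: it fits a function as the sum of two components, one from a large-scale Gaussian RKHS and one from a small-scale Gaussian RKHS, by solving a single block linear system obtained from the stationarity conditions of the regularized least-squares objective. All parameter values (kernel widths, regularization constants) are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    # Gram matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    d2 = (np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_sum_space(X, y, sigmas=(1.0, 0.05), lams=(1e-4, 1e-4)):
    """Regularized least squares over the sum of two Gaussian RKHSs.

    Minimizes (1/n)||K1 a + K2 b - y||^2 + lam1 a'K1 a + lam2 b'K2 b.
    Setting the gradients to zero gives a 2n x 2n linear system:
        (K1 + n*lam1*I) a + K2 b = y
        K1 a + (K2 + n*lam2*I) b = y
    (illustrative derivation; parameters are not from the paper).
    """
    n = len(y)
    K1 = gaussian_kernel(X, X, sigmas[0])  # large scale: low frequencies
    K2 = gaussian_kernel(X, X, sigmas[1])  # small scale: high frequencies
    I = np.eye(n)
    A = np.block([[K1 + n * lams[0] * I, K2],
                  [K1, K2 + n * lams[1] * I]])
    ab = np.linalg.solve(A, np.concatenate([y, y]))
    a, b = ab[:n], ab[n:]

    def predict(Xnew):
        return (gaussian_kernel(Xnew, X, sigmas[0]) @ a
                + gaussian_kernel(Xnew, X, sigmas[1]) @ b)
    return predict

# A nonflat target: a smooth trend plus a localized high-frequency part.
X = np.linspace(0.0, 1.0, 80)[:, None]
y = (np.sin(2 * np.pi * X[:, 0])
     + 0.5 * np.sin(20 * np.pi * X[:, 0]) * (X[:, 0] > 0.5))
f = fit_sum_space(X, y)
train_err = np.max(np.abs(f(X) - y))
```

A single-scale kernel would have to compromise between the smooth region and the oscillatory region; the sum-space fit assigns each regime to the kernel whose scale matches it, which is the intuition behind the paper's improved learning rate.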
Details
- Language :
- English
- ISSN :
- 2162-237X and 2162-2388
- Volume :
- 24
- Issue :
- 4
- Database :
- Supplemental Index
- Journal :
- IEEE Transactions on Neural Networks and Learning Systems
- Publication Type :
- Periodical
- Accession number :
- ejs29618537
- Full Text :
- https://doi.org/10.1109/TNNLS.2013.2242091