Kernel gradient descent algorithm for information theoretic learning
- Source :
- Journal of Approximation Theory. 263:105518
- Publication Year :
- 2021
- Publisher :
- Elsevier BV
Abstract
- Information theoretic learning is a learning paradigm built on concepts of entropy and divergence from information theory. A variety of signal processing and machine learning methods fall into this framework; the minimum error entropy principle is a typical example. In this paper, we study a kernel version of minimum error entropy methods that can be used to find nonlinear structures in the data. We show that the kernel minimum error entropy method can be implemented by kernel-based gradient descent algorithms, with or without regularization. Convergence rates for both algorithms are derived.
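- The kernel-based gradient descent described in the abstract can be sketched in code. In the minimum error entropy (MEE) framework, minimizing Rényi's quadratic entropy of the errors is equivalent to maximizing the empirical information potential V(e) = (1/n²) Σᵢⱼ G_σ(eᵢ − eⱼ), where G_σ is a Gaussian (Parzen) window. The sketch below runs gradient ascent on V over the coefficients of a kernel expansion f(x) = Σₖ αₖ K(x, xₖ); setting `lam > 0` gives the regularized variant mentioned in the abstract. This is an illustrative sketch under standard MEE assumptions, not the paper's exact algorithm, and all function names and parameter values here are assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * d2)

def info_potential(e, sigma):
    """Empirical information potential V(e) = (1/n^2) sum_{i,j} G_sigma(e_i - e_j).
    Maximizing V is equivalent to minimizing Renyi's quadratic error entropy."""
    D = e[:, None] - e[None, :]
    return np.mean(np.exp(-D ** 2 / (2 * sigma ** 2)))

def kernel_mee_gd(X, y, sigma=0.5, gamma=5.0, lr=0.01, lam=0.0, steps=2000):
    """Gradient ascent on V(alpha) for the kernel model f(x) = sum_k alpha_k K(x, x_k).
    lam > 0 adds a ridge penalty on alpha (the regularized variant)."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(n)
    for _ in range(steps):
        e = y - K @ alpha                        # residuals of the kernel model
        D = e[:, None] - e[None, :]              # pairwise residual gaps e_i - e_j
        G = np.exp(-D ** 2 / (2 * sigma ** 2))   # Parzen window evaluations
        # dV/de_i = -(2 / (n^2 sigma^2)) * sum_j (e_i - e_j) G_sigma(e_i - e_j)
        dV_de = -(2.0 / (n ** 2 * sigma ** 2)) * (D * G).sum(axis=1)
        grad = -K.T @ dV_de - lam * alpha        # chain rule: de/dalpha = -K
        alpha += lr * grad                       # ascent step on the potential
    return alpha, K

# Toy usage: fit a smooth nonlinear target, then check that the error
# entropy decreased, i.e. the information potential of the residuals grew.
# (V depends only on pairwise differences, so it is shift-invariant in e.)
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(60, 1))
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.standard_normal(60)
alpha, K = kernel_mee_gd(X, y)
print(info_potential(y - K @ alpha, sigma=0.5) > info_potential(y, sigma=0.5))
```

One design note: because V is shift-invariant, plain MEE fixes the shape of the error distribution but not its mean, which is why the demo compares information potentials of the residuals rather than raw squared error.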
- Subjects :
- Numerical Analysis
Signal processing
Applied Mathematics
General Mathematics
Information theory
Regularization (mathematics)
Kernel method
Kernel (statistics)
Convergence (mathematics)
Entropy (information theory)
Gradient descent
Algorithm
Analysis
Mathematics
Details
- ISSN :
- 0021-9045
- Volume :
- 263
- Database :
- OpenAIRE
- Journal :
- Journal of Approximation Theory
- Accession number :
- edsair.doi...........1e3718579129a54eabba807c7f2b044f