
Kernel gradient descent algorithm for information theoretic learning

Authors :
Ting Hu
Qiang Wu
Ding-Xuan Zhou
Source :
Journal of Approximation Theory. 263:105518
Publication Year :
2021
Publisher :
Elsevier BV

Abstract

Information theoretic learning is a learning paradigm built on entropy and divergence concepts from information theory. A variety of signal processing and machine learning methods fall into this framework; the minimum error entropy (MEE) principle is a typical example. In this paper, we study a kernel version of minimum error entropy methods that can be used to find nonlinear structures in the data. We show that kernel minimum error entropy can be implemented by kernel-based gradient descent algorithms, with or without regularization. Convergence rates for both algorithms are derived.
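The abstract does not give implementation details, but the kind of algorithm it describes can be sketched as follows. This is an illustrative reconstruction, not the paper's actual method: it runs gradient descent on the coefficients of a kernel expansion, maximizing the empirical information potential V(f) = (1/n²) Σᵢⱼ G_σ(eᵢ − eⱼ) of the residuals eᵢ = yᵢ − f(xᵢ) (a standard surrogate for minimizing error entropy), with an optional Tikhonov penalty λ‖f‖²_K for the regularized variant. All parameter names and values here are assumptions chosen for the demo.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma):
    """Pairwise Gaussian (RBF) kernel matrix for 1-D inputs."""
    d = X[:, None] - X[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def kernel_mee_gd(X, y, sigma_k=0.5, sigma_e=1.0, lam=1e-4,
                  eta=0.02, steps=3000):
    """Kernel gradient descent for minimum error entropy (illustrative sketch).

    Minimizes J(alpha) = -V(alpha) + lam * alpha^T K alpha, where
        V(alpha) = (1/n^2) * sum_{i,j} exp(-(e_i - e_j)^2 / (2*sigma_e^2))
    is the empirical information potential of the residuals e = y - K @ alpha.
    Set lam=0 for the unregularized variant.
    """
    n = len(y)
    K = gaussian_kernel_matrix(X, sigma_k)
    alpha = np.zeros(n)
    for _ in range(steps):
        e = y - K @ alpha                        # current residuals e_i
        de = e[:, None] - e[None, :]             # pairwise differences e_i - e_j
        G = np.exp(-de**2 / (2.0 * sigma_e**2))  # Gaussian window on the errors
        r = (de * G).sum(axis=1) / sigma_e**2    # row sums of -G'(e_i - e_j)
        # By antisymmetry of de*G, grad V = (2/n^2) * K @ r, so
        # grad J = -(2/n^2) * K @ r + 2*lam * K @ alpha.
        grad = -(2.0 / n**2) * (K @ r) + 2.0 * lam * (K @ alpha)
        alpha -= eta * grad
    return alpha, K

# Demo: regress y = sin(pi x). Error entropy is shift-invariant (a constant
# offset of all residuals leaves V unchanged), so compare the *spread* of the
# residuals before and after training rather than their raw values.
X = np.linspace(-1.0, 1.0, 30)
y = np.sin(np.pi * X)
alpha, K = kernel_mee_gd(X, y)
residuals = y - K @ alpha
spread_before = np.std(y - y.mean())
spread_after = np.std(residuals - residuals.mean())
```

Because of the shift invariance noted in the demo, practical MEE regressors typically re-center the fit afterwards, e.g. by adding the mean residual as a bias term.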

Details

ISSN :
0021-9045
Volume :
263
Database :
OpenAIRE
Journal :
Journal of Approximation Theory
Accession number :
edsair.doi...........1e3718579129a54eabba807c7f2b044f