Learning theory of minimum error entropy under weak moment conditions.
- Source :
- Analysis & Applications. Jan2022, Vol. 20 Issue 1, p121-139. 19p.
- Publication Year :
- 2022
Abstract
- Minimum error entropy (MEE) is an information theoretic learning approach that minimizes the information contained in the prediction error, measured by entropy. It has been successfully used in various machine learning tasks for its robustness to heavy-tailed distributions and outliers. In this paper, we consider its use in nonparametric regression and analyze its generalization performance from a learning theory perspective by imposing a (1+ε)th order moment condition on the noise variable. To this end, we establish a comparison theorem to characterize the relation between the excess generalization error and the prediction error. A relaxed Bernstein condition and concentration inequalities are used to derive error bounds and learning rates. Note that the (1+ε)th moment condition is rather weak, particularly when ε < 1, because the noise variable does not even admit a finite variance in this case. Therefore, our analysis explains the robustness of MEE in the presence of heavy-tailed distributions. [ABSTRACT FROM AUTHOR]
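- As a rough illustration of the criterion the abstract describes, the empirical MEE risk is commonly written as Rényi's quadratic entropy of the prediction errors, estimated with a Gaussian Parzen window. The sketch below is a generic information-theoretic-learning formulation, not the specific estimator analyzed in this paper; the bandwidth `h` and the sample errors are illustrative assumptions.

```python
import numpy as np

def mee_empirical_risk(errors, h=1.0):
    """Empirical MEE risk: Renyi's quadratic entropy of the errors,
    estimated with a Gaussian Parzen window of bandwidth h.
    A minimal sketch; h and the inputs are illustrative."""
    e = np.asarray(errors, dtype=float)
    diff = e[:, None] - e[None, :]           # pairwise error differences e_i - e_j
    kernel = np.exp(-diff**2 / (2 * h**2))   # Gaussian kernel G_h(e_i - e_j)
    information_potential = kernel.mean()    # V = (1/n^2) * sum_ij G_h(e_i - e_j)
    return -np.log(information_potential)    # H_2 = -log V, the quantity minimized

# Tightly concentrated errors yield lower error entropy than spread-out errors,
# which is why minimizing this risk concentrates the prediction error.
tight = mee_empirical_risk([0.10, 0.11, 0.09, 0.10])
spread = mee_empirical_risk([-3.0, 2.0, 5.0, -4.0])
```

Because the risk depends only on pairwise differences of errors, it is insensitive to a constant shift of the predictor, which is why learning-theory analyses of MEE (as here) relate the excess generalization error to the prediction error through a comparison theorem rather than directly.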
- Subjects :
- *ERROR analysis in mathematics
*ENTROPY (Information theory)
*MACHINE learning
Details
- Language :
- English
- ISSN :
- 02195305
- Volume :
- 20
- Issue :
- 1
- Database :
- Academic Search Index
- Journal :
- Analysis & Applications
- Publication Type :
- Academic Journal
- Accession number :
- 154930065
- Full Text :
- https://doi.org/10.1142/S0219530521500044