Minimax Optimal Estimation of KL Divergence for Continuous Distributions.
- Source :
- IEEE Transactions on Information Theory; Dec 2020, Vol. 66 Issue 12, p7787-7811, 25p
- Publication Year :
- 2020
Abstract
- Estimating Kullback-Leibler divergence from independent and identically distributed (i.i.d.) samples is an important problem in various domains. One simple and effective estimator is based on the $k$ nearest neighbor distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator. Furthermore, we derive a lower bound on the minimax mean square error and show that the kNN method is asymptotically rate optimal.
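- To make the estimator concrete: given $n$ samples from $P$ and $m$ samples from $Q$ in $\mathbb{R}^d$, the kNN estimator compares each sample's $k$-th nearest-neighbor distance within its own sample set to that within the other set. The following Python sketch implements the standard kNN construction of this kind; the function name and interface are illustrative, and the paper's exact variant and correction terms may differ.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x_samples, y_samples, k=1):
    """kNN estimate of D(P || Q) from x_samples ~ P and y_samples ~ Q.

    A minimal sketch of the classical k-nearest-neighbor estimator;
    illustrative only, not necessarily the paper's exact form.
    """
    x = np.atleast_2d(x_samples)
    y = np.atleast_2d(y_samples)
    n, d = x.shape
    m = y.shape[0]
    # rho_i: distance from x_i to its k-th nearest neighbor among the
    # other x_j (query k+1 neighbors because the closest is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu_i: distance from x_i to its k-th nearest neighbor among the y_j.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # Estimator: (d/n) * sum_i log(nu_i / rho_i) + log(m / (n - 1)).
    return (d / n) * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))

if __name__ == "__main__":
    # Sanity check on Gaussians: D(N(0,1) || N(1,1)) = 0.5 exactly.
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=(2000, 1))
    y = rng.normal(1.0, 1.0, size=(2000, 1))
    print(knn_kl_divergence(x, y, k=1))  # should be close to 0.5
```

- The log-ratio of neighbor distances estimates the log density ratio at each sample point, and the $\log(m/(n-1))$ term corrects for the differing sample sizes; the paper's contribution is the bias/variance convergence analysis and the matching minimax lower bound for this type of estimator.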
Details
- Language :
- English
- ISSN :
- 0018-9448
- Volume :
- 66
- Issue :
- 12
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Information Theory
- Publication Type :
- Academic Journal
- Accession number :
- 147291914
- Full Text :
- https://doi.org/10.1109/TIT.2020.3009923