
Nearest Neighbor Density Functional Estimation From Inverse Laplace Transform.

Authors :
Ryu, J. Jon
Ganguly, Shouvik
Kim, Young-Han
Noh, Yung-Kyun
Lee, Daniel D.
Source :
IEEE Transactions on Information Theory; Jun2022, Vol. 68 Issue 6, p3511-3551, 41p
Publication Year :
2022

Abstract

A new approach to $L_{2}$-consistent estimation of a general density functional using $k$-nearest neighbor distances is proposed, where the functional under consideration is in the form of the expectation of some function $f$ of the densities at each point. The estimator is designed to be asymptotically unbiased, using the convergence of the normalized volume of a $k$-nearest neighbor ball to a Gamma distribution in the large-sample limit, and naturally involves the inverse Laplace transform of a scaled version of the function $f$. Some instantiations of the proposed estimator recover existing $k$-nearest neighbor based estimators of Shannon and Rényi entropies and Kullback–Leibler and Rényi divergences, and discover new consistent estimators for many other functionals such as logarithmic entropies and divergences. The $L_{2}$-consistency of the proposed estimator is established for a broad class of densities for general functionals, and the convergence rate in mean squared error is established as a function of the sample size for smooth, bounded densities.
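
For the special case $f(p) = -\log p$ (differential Shannon entropy), the abstract notes that the framework recovers an existing $k$-nearest neighbor estimator. The sketch below is a minimal Python illustration of that classical special case only, the Kozachenko–Leonenko-type estimator, not of the paper's general inverse-Laplace-transform construction; the function name knn_shannon_entropy and the use of SciPy's cKDTree are assumptions made for the example.

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree


def knn_shannon_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential Shannon entropy.

    x : (n, d) array of i.i.d. samples.
    k : number of nearest neighbors (excluding the point itself).
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Distance from each sample to its k-th nearest neighbor; ask for k+1
    # neighbors because the query point itself is returned at distance 0.
    r_k = tree.query(x, k=k + 1)[0][:, -1]
    # Log-volume of the d-dimensional unit Euclidean ball.
    log_unit_ball = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # Plug-in of the k-NN ball volumes with the digamma bias correction,
    # which makes the estimator asymptotically unbiased.
    return (digamma(n) - digamma(k) + log_unit_ball
            + d * np.mean(np.log(r_k)))


if __name__ == "__main__":
    # Sanity check against the closed form for a standard 2-D Gaussian,
    # H = (d/2) * log(2 * pi * e).
    rng = np.random.default_rng(0)
    samples = rng.standard_normal((5000, 2))
    print(knn_shannon_entropy(samples, k=5))  # should be near 2.838
```

The digamma correction term $\psi(n) - \psi(k)$ plays the role, in this special case, of the asymptotic-unbiasedness adjustment the abstract describes via the Gamma-distribution limit of the normalized $k$-NN ball volume.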

Details

Language :
English
ISSN :
0018-9448
Volume :
68
Issue :
6
Database :
Complementary Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
157007230
Full Text :
https://doi.org/10.1109/TIT.2022.3151231