
Minimax Estimation of Discrete Distributions Under $\ell_1$ Loss.

Authors :
Han, Yanjun
Jiao, Jiantao
Weissman, Tsachy
Source :
IEEE Transactions on Information Theory; Nov 2015, Vol. 61, Issue 11, p6343-6354, 12p
Publication Year :
2015

Abstract

We consider the problem of discrete distribution estimation under $\ell_1$ loss. We provide tight upper and lower bounds on the maximum risk of the empirical distribution (the maximum likelihood estimator), and on the minimax risk, in regimes where the support size $S$ may grow with the number of observations $n$. We show that among distributions with bounded entropy $H$, the asymptotic maximum risk for the empirical distribution is $2H/\ln n$, while the asymptotic minimax risk is $H/\ln n$. Moreover, we show that a hard-thresholding estimator oblivious to the unknown upper bound $H$ is essentially minimax. However, if we constrain the estimates to lie in the simplex of probability distributions, then the asymptotic minimax risk is again $2H/\ln n$. We draw connections between our work and the literature on density estimation, entropy estimation, total variation distance ($\ell_1$ divergence) estimation, joint distribution estimation in stochastic processes, normal mean estimation, and adaptive estimation. [ABSTRACT FROM PUBLISHER]
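The two estimators the abstract contrasts are simple to illustrate. Below is a minimal Python sketch (not the authors' code) of the empirical distribution and a generic hard-thresholding estimator of the kind the abstract describes; the threshold scale c·ln(n)/n and the constant c are assumptions chosen for illustration, and the paper's exact thresholding rule may differ. Note that the thresholded estimate is deliberately not renormalized, so it need not lie in the probability simplex; this is precisely the distinction the abstract draws between the unconstrained and simplex-constrained minimax risks.

```python
import numpy as np

def empirical_distribution(samples, support_size):
    """Empirical (maximum likelihood) estimate of a discrete distribution."""
    counts = np.bincount(samples, minlength=support_size)
    return counts / len(samples)

def hard_threshold_estimate(samples, support_size, c=1.0):
    """Illustrative hard-thresholding sketch: zero out small empirical masses.

    The threshold c * ln(n) / n is an assumed scale typical of this
    literature; the constant and exact rule in the paper may differ.
    The result is intentionally NOT renormalized, so it may leave the
    probability simplex.
    """
    n = len(samples)
    p_hat = empirical_distribution(samples, support_size)
    thresh = c * np.log(n) / n
    return np.where(p_hat >= thresh, p_hat, 0.0)

def l1_loss(p, q):
    """The l1 loss between two (sub)probability vectors."""
    return np.abs(p - q).sum()

# Toy comparison on a uniform distribution over 100 symbols:
rng = np.random.default_rng(0)
S, n = 100, 1000
p = np.full(S, 1.0 / S)
x = rng.choice(S, size=n, p=p)
print(l1_loss(p, empirical_distribution(x, S)))
print(l1_loss(p, hard_threshold_estimate(x, S)))
```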

Details

Language :
English
ISSN :
0018-9448
Volume :
61
Issue :
11
Database :
Complementary Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession Number :
110439821
Full Text :
https://doi.org/10.1109/TIT.2015.2478816