
Finite-Sample Risk Bounds for Maximum Likelihood Estimation With Arbitrary Penalties.

Authors :
Brinda, W. D.
Klusowski, Jason M.
Source :
IEEE Transactions on Information Theory. Apr 2018, Vol. 64, Issue 4, p2727-2741. 15p.
Publication Year :
2018

Abstract

The minimum description length two-part coding index of resolvability provides a finite-sample upper bound on the statistical risk of penalized likelihood estimators over countable models. However, the bound does not apply to unpenalized maximum likelihood estimation or to procedures with exceedingly small penalties. In this paper, we point out a more general inequality that holds for arbitrary penalties. In addition, this approach makes it possible to derive exact risk bounds of order $1/n$ for i.i.d. parametric models, improving on the $(\log n)/n$ order of the resolvability bounds. We conclude by discussing implications for adaptive estimation.
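For context, the resolvability bound that the abstract refers to is, schematically, the classical Barron–Cover two-part MDL risk bound. The display below is a hedged sketch of its usual form, not the paper's exact statement: the constant $C$, the divergence $d$, and the precise Kraft-type condition on the penalty are assumptions that vary across formulations.

% Schematic sketch of the index-of-resolvability bound alluded to in the abstract.
% The constant C, the divergence d, and the exact penalty condition are assumptions
% here, not this paper's statement.
\[
  R_n(p^\ast) \;=\; \min_{\theta \in \Gamma_n}
    \Bigl\{ D\bigl(p^\ast \,\Vert\, p_\theta\bigr) + \frac{\mathrm{pen}_n(\theta)}{n} \Bigr\},
  \qquad
  \mathbb{E}\, d^2\bigl(p^\ast, p_{\hat\theta}\bigr) \;\le\; C \, R_n(p^\ast),
\]
% where \Gamma_n is a countable model, \hat\theta is the penalized MLE over \Gamma_n,
% the penalty dominates a Kraft-summable codelength
% (roughly \sum_{\theta \in \Gamma_n} e^{-\mathrm{pen}_n(\theta)} \le 1, up to constants),
% and d is, e.g., the Hellinger distance.

In this schematic form, the penalty term $\mathrm{pen}_n(\theta)/n$ for a fixed finite-dimensional parametric family typically scales like $(\log n)/n$, which is the rate the paper improves to $1/n$ for i.i.d. parametric models.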

Details

Language :
English
ISSN :
0018-9448
Volume :
64
Issue :
4
Database :
Academic Search Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
128558554
Full Text :
https://doi.org/10.1109/TIT.2017.2789214