
Asymptotically Tight Bounds on the Depth of Estimated Context Trees.

Authors :
Martín, Álvaro
Seroussi, Gadiel
Vitale, Luciana
Source :
IEEE Transactions on Information Theory; March 2019, Vol. 65, Issue 3, pp. 1793-1806
Publication Year :
2019

Abstract

We study the maximum Markov model order that can be estimated for an (individual) input sequence $x$ of length $n$ over a finite alphabet of size $\alpha$, for popular estimators of tree models, where the estimated order is determined by the depth of a context tree estimate, and in the special case of plain Markov models, where the tree is constrained to be perfect (with all leaves at the same depth). First, we consider penalized maximum likelihood estimators, where a context tree $\hat{T}$ is obtained by minimizing a cost of the form $-\log \hat{P}_{T}(x) + f(n)|S_{T}|$, where $\hat{P}_{T}(x)$ is the maximum likelihood (ML) probability of $x$ under a model with context tree $T$, $S_{T}$ is the set of leaves of $T$, and $f(n)$ is an increasing (penalization) function of $n$ (the popular BIC estimator is a special case with $f(n)=\frac{\alpha-1}{2}\log n$). For plain Markov models, a simple argument yields a known upper bound $\overline{k}(n) = O(\log n)$ on the maximum order that can be estimated for $x$, and we show that, in fact, this simple bound is not far from tight. For general context trees, we derive an asymptotic upper bound, $n^{1/2 - o(1)}$, on the estimated depth, and we exhibit explicit input sequences that asymptotically attain the bound up to a constant multiplicative factor. We show that a similar upper bound also applies to MDL estimators based on the Krichevsky-Trofimov (KT) probability assignment and, moreover, that the same example sequences asymptotically approach the upper bound in this case as well.
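To make the minimized cost concrete, the following is a minimal Python sketch for the special case of plain (perfect-tree) order-k Markov models, where the tree has |S_T| = alpha^k leaves. The function names bic_cost and kt_logprob are hypothetical illustrations, not code from the paper; logarithms are natural (nats), and the input is assumed to be a string over a small alphabet.

import math
from collections import Counter

def bic_cost(x, k, alpha=2):
    # Penalized ML cost  -log P_T(x) + f(n)|S_T|  for a plain (perfect-tree)
    # order-k Markov model, with the BIC penalty f(n) = (alpha-1)/2 * log(n).
    # A perfect tree of depth k has |S_T| = alpha**k leaves.
    n = len(x)
    pair = Counter((x[i - k:i], x[i]) for i in range(k, n))  # (context, symbol) counts
    ctx = Counter(x[i - k:i] for i in range(k, n))           # context counts
    log_ml = sum(c * math.log(c / ctx[s]) for (s, a), c in pair.items())
    return -log_ml + (alpha - 1) / 2 * math.log(n) * alpha ** k

def kt_logprob(x, k, alpha=2):
    # Krichevsky-Trofimov sequential log-probability of x with order-k contexts:
    # each context assigns the next symbol probability (count + 1/2) / (total + alpha/2).
    sym, tot = Counter(), Counter()
    lp = 0.0
    for i in range(k, len(x)):
        c = x[i - k:i]
        lp += math.log((sym[(c, x[i])] + 0.5) / (tot[c] + alpha / 2))
        sym[(c, x[i])] += 1
        tot[c] += 1
    return lp

if __name__ == "__main__":
    x = "0110101101101011" * 8           # toy binary sequence, alpha = 2
    costs = {k: bic_cost(x, k) for k in range(5)}
    k_hat = min(costs, key=costs.get)    # estimated order = depth of the perfect tree
    print("estimated order:", k_hat)
    print("KT log-probability at k_hat:", kt_logprob(x, k_hat))

On longer inputs, the order k_hat selected this way can grow at most logarithmically in n for the plain-Markov case, which is the simple bound the paper shows to be not far from tight.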

Details

Language :
English
ISSN :
0018-9448
Volume :
65
Issue :
3
Database :
Complementary Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession Number :
134886993
Full Text :
https://doi.org/10.1109/TIT.2018.2889302