Fundamental limits for learning hidden Markov model parameters
- Publication Year:
- 2021
Abstract
- We study the frontier between learnable and unlearnable hidden Markov models (HMMs). HMMs are flexible tools for clustering dependent data coming from unknown populations. The model parameters are known to be fully identifiable (up to label-switching) without any modeling assumption on the distributions of the populations, as soon as the clusters are distinct and the hidden chain is ergodic with a full-rank transition matrix. In the limit as any one of these conditions fails, it becomes impossible in general to identify the parameters. For a chain with two hidden states we prove nonasymptotic minimax upper and lower bounds, matching up to constants, which exhibit thresholds at which the parameters become learnable. We also provide an upper bound on the relative entropy rate for parameters in a neighbourhood of the unlearnable region, which may be of interest in itself.
- Comment: To appear in IEEE Transactions on Information Theory. Print ISSN: 0018-9448; Online ISSN: 1557-9654
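The identifiability conditions named in the abstract (an ergodic hidden chain with a full-rank transition matrix and distinct emission distributions) can be illustrated with a minimal simulation sketch. This is not the authors' construction; all parameter values below are hypothetical, chosen only to satisfy those conditions for a two-state Gaussian-emission HMM.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state HMM. The transition matrix is ergodic
# (all entries positive) and full rank (nonzero determinant),
# and the two emission means are distinct -- the regime in which
# the abstract says the parameters are identifiable.
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])
means = np.array([-1.0, 1.0])  # distinct cluster means

# Full rank <=> nonzero determinant: det(A) = 0.8*0.7 - 0.2*0.3 = 0.5.
assert abs(np.linalg.det(A)) > 0

def sample_hmm(n, A, means, rng):
    """Sample n observations from a two-state Gaussian HMM."""
    states = np.empty(n, dtype=int)
    states[0] = 0
    for t in range(1, n):
        states[t] = rng.choice(2, p=A[states[t - 1]])
    # Unit-variance Gaussian emissions centred at the cluster means.
    obs = means[states] + rng.standard_normal(n)
    return states, obs

states, obs = sample_hmm(1000, A, means, rng)
```

As the rows of `A` approach each other (rank deficiency) or the two means merge (indistinct clusters), this sampler still runs, but the paper's thresholds mark where recovering `A` and the emission parameters from `obs` alone becomes impossible.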
Details
- Database:
- arXiv
- Publication Type:
- Report
- Accession number:
- edsarx.2106.12936
- Document Type:
- Working Paper
- Full Text:
- https://doi.org/10.1109/TIT.2022.3213429