Asymptotic analysis of model selection criteria for general hidden Markov models
- Author
Alexandros Beskos, Shouto Yonekura, and Sumeetpal S. Singh
- Subjects
Statistics and Probability, Asymptotic analysis, Applied Mathematics, Model selection, Bayesian probability, Statistics::Computation, Statistics::Machine Learning, Statistics::Methodology, Modeling and Simulation, Akaike information criterion, Hidden Markov model, Mathematics
- Abstract
The paper obtains analytical results on the asymptotic properties of model selection criteria, widely used in practice, for a general family of hidden Markov models (HMMs), thereby substantially extending the related theory beyond typical ‘i.i.d.-like’ model structures and filling an important gap in the literature. In particular, we study the Bayesian and Akaike Information Criteria (BIC and AIC) and the model evidence. In the setting of nested classes of models, we prove that BIC and the evidence are strongly consistent for HMMs (under regularity conditions), whereas AIC is not weakly consistent. Numerical experiments support our theoretical results.
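For reference, the standard forms of the two criteria discussed above (the paper's exact definitions and regularity assumptions for HMMs may differ in notation) are
\[
\mathrm{AIC} = -2\log \hat{L}_n + 2k, \qquad \mathrm{BIC} = -2\log \hat{L}_n + k\log n,
\]
where \(\hat{L}_n\) is the maximized likelihood of the observed sequence \(y_{1:n}\) under the candidate model, \(k\) is the number of free parameters, and \(n\) is the length of the observation record. The usual intuition is that BIC's penalty grows with \(n\), which supports consistency in nested settings, whereas AIC's fixed per-parameter penalty can select over-parameterized models with non-vanishing probability.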
- Published
- 2021