
Computation and Estimation of Generalized Entropy Rates for Denumerable Markov Chains

Authors :
Valérie Girardin
G. Ciuperca
Loïck Lhote
Affiliations :
Laboratoire de Probabilités, Combinatoire et Statistique (LAPCS), Université Claude Bernard Lyon 1 (UCBL), Université de Lyon
Laboratoire de Mathématiques Nicolas Oresme (LMNO), Centre National de la Recherche Scientifique (CNRS), Université de Caen Normandie (UNICAEN), Normandie Université (NU)
Equipe AMACC, Groupe de Recherche en Informatique, Image et Instrumentation de Caen (GREYC, UMR 6072), Centre National de la Recherche Scientifique (CNRS), École Nationale Supérieure d'Ingénieurs de Caen (ENSICAEN), Université de Caen Normandie (UNICAEN), Normandie Université (NU)
Source :
IEEE Transactions on Information Theory, Institute of Electrical and Electronics Engineers, 2011, 57, pp. 4026-4034. ⟨10.1109/TIT.2011.2133710⟩
Publication Year :
2011
Publisher :
HAL CCSD, 2011.

Abstract

We study entropy rates of random sequences for general entropy functionals, including the classical Shannon and Rényi entropies and the more recent Tsallis and Sharma-Mittal ones. In the first part, we obtain an explicit formula for the entropy rate for a large class of entropy functionals, as soon as the process satisfies a regularity property known in dynamical systems theory as the quasi-power property. Independent and identically distributed sequences of random variables naturally satisfy this property. Markov chains are proven to satisfy it too, under simple explicit conditions on their transition probabilities. All the entropy rates under study are thus shown to be either infinite or zero, except at a threshold where they are equal to Shannon or Rényi entropy rates up to a multiplicative constant. In the second part, we focus on the estimation of the marginal generalized entropy and entropy rate for parametric Markov chains. Estimators with good asymptotic properties are built through a plug-in procedure using a maximum likelihood estimation of the parameter.
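
As a concrete illustration of the quantities mentioned in the abstract (not taken from the paper itself), the following minimal sketch computes, for a small finite-state Markov chain, the Shannon entropy rate and the Rényi entropy rate, together with plug-in estimates obtained by replacing the transition matrix with its maximum likelihood (empirical transition count) estimate. It uses the standard formula H_s = (1/(1-s)) log λ(s), where λ(s) is the dominant eigenvalue of the matrix with entries p_ij^s; all function names and the example chain are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: entropy rates of a finite-state Markov chain
# and plug-in estimates from an observed trajectory (MLE of the transitions).
import numpy as np

def stationary_distribution(P):
    """Stationary distribution of an irreducible transition matrix P."""
    evals, evecs = np.linalg.eig(P.T)
    v = np.real(evecs[:, np.argmax(np.real(evals))])  # left Perron eigenvector
    return v / v.sum()

def shannon_entropy_rate(P):
    """H = -sum_i pi_i sum_j P_ij log P_ij (natural logarithm)."""
    pi = stationary_distribution(P)
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    return -np.sum(pi[:, None] * P * logP)

def renyi_entropy_rate(P, s):
    """Renyi rate of order s != 1: (1/(1-s)) log lambda(s), with lambda(s)
    the dominant eigenvalue of the matrix with entries P_ij ** s."""
    lam = np.max(np.real(np.linalg.eigvals(P ** s)))
    return np.log(lam) / (1.0 - s)

def mle_transition_matrix(x, n_states):
    """Maximum likelihood (empirical count) estimate of P from a trajectory x."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(x[:-1], x[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    # Unvisited states get a uniform row so the estimate stays stochastic.
    return np.divide(counts, rows,
                     out=np.full_like(counts, 1.0 / n_states), where=rows > 0)

# Example: true rates versus plug-in estimates on a simulated trajectory.
rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
x = [0]
for _ in range(10_000):
    x.append(rng.choice(2, p=P[x[-1]]))
P_hat = mle_transition_matrix(x, 2)
print(shannon_entropy_rate(P), shannon_entropy_rate(P_hat))
print(renyi_entropy_rate(P, 2.0), renyi_entropy_rate(P_hat, 2.0))
```

The plug-in estimates converge to the true rates as the trajectory length grows; the paper's second part establishes asymptotic properties of estimators built in this spirit for parametric denumerable Markov chains.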

Details

Language :
English
ISSN :
0018-9448
Database :
OpenAIRE
Journal :
IEEE Transactions on Information Theory, Institute of Electrical and Electronics Engineers, 2011, 57, pp. 4026-4034. ⟨10.1109/TIT.2011.2133710⟩
Accession number :
edsair.doi.dedup.....946965e6d50168df9960201dbdbf9a2d