
On Relations between the Relative Entropy and χ²-Divergence, Generalizations and Applications

Authors :
Igal Sason
Tomohiro Nishiyama
Source :
Entropy, Volume 22, Issue 5, p. 563 (2020)
Publication Year :
2020
Publisher :
Multidisciplinary Digital Publishing Institute, 2020.

Abstract

The relative entropy and chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of $f$-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.

Comment: Published in the Entropy journal, May 18, 2020. The journal version (open access) is available at https://www.mdpi.com/1099-4300/22/5/563
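For context, the two divergences in the title are defined below for probability measures $P \ll Q$ with densities $p$ and $q$ with respect to a common dominating measure $\mu$, and with the natural logarithm for the relative entropy. The displayed integral identity is one known relation of this kind, stated here only as an illustrative example of what the abstract refers to; the intermediate measure $R_s := sQ + (1-s)P$ is notation introduced for this sketch rather than taken from the paper itself.

$$ D(P \,\|\, Q) = \int p \, \log\frac{p}{q} \, \mathrm{d}\mu, \qquad \chi^2(P \,\|\, Q) = \int \frac{(p-q)^2}{q} \, \mathrm{d}\mu, $$

$$ D(P \,\|\, Q) = \int_0^1 \frac{\chi^2\!\left(P \,\|\, R_s\right)}{s} \, \mathrm{d}s, \qquad R_s := sQ + (1-s)P. $$

The identity follows by writing $\chi^2(P \,\|\, R_s) = s^2 \int \frac{(p-q)^2}{sq+(1-s)p}\,\mathrm{d}\mu$ and integrating over $s \in (0,1)$; the integrand vanishes as $s \to 0$, so the integral is well defined.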

Details

Language :
English
ISSN :
1099-4300
Database :
OpenAIRE
Journal :
Entropy
Accession number :
edsair.doi.dedup.....a261dbe480e0f3814556ba9c98b2a129
Full Text :
https://doi.org/10.3390/e22050563