1. On Relations Between the Relative Entropy and χ²-Divergence, Generalizations and Applications
- Author
- Igal Sason and Tomohiro Nishiyama
- Subjects
- Kullback–Leibler divergence, chi-squared divergence, relative entropy, f-divergences, information contraction, strong data-processing inequalities, maximal correlation, Markov chains, method of types, large deviations, lossless compression, Computer Science - Information Theory, Mathematics - Probability
- Abstract
- The relative entropy and chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of $f$-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.
- Comment
- Published in the Entropy journal, May 18, 2020. The journal version (open access) is available at https://www.mdpi.com/1099-4300/22/5/563
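- For context, here is a minimal sketch (not taken from the paper) of the two divergences the abstract refers to. For discrete distributions P and Q with Q(x) > 0, the relative entropy is D(P‖Q) = Σ_x P(x) log(P(x)/Q(x)) and the chi-squared divergence is χ²(P‖Q) = Σ_x (P(x) − Q(x))²/Q(x). The snippet computes both and checks the classical bound D(P‖Q) ≤ log(1 + χ²(P‖Q)); function names and the example distributions are illustrative, not from the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(P||Q) in nats; assumes q(x) > 0 wherever p(x) > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chi_squared_divergence(p, q):
    """Chi-squared divergence chi^2(P||Q) = sum_x (p(x) - q(x))^2 / q(x); assumes q > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum((p - q) ** 2 / q))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

d = kl_divergence(p, q)
chi2 = chi_squared_divergence(p, q)
print(f"D(P||Q)     = {d:.6f} nats")
print(f"chi^2(P||Q) = {chi2:.6f}")
# Classical bound relating the two divergences: D(P||Q) <= log(1 + chi^2(P||Q)).
assert d <= np.log1p(chi2)
```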
- Published
- 2020