A Tight Lower Bound on the Mutual Information of a Binary and an Arbitrary Finite Random Variable in Dependence of the Variational Distance
- Publication Year :
- 2013
Abstract
- In this paper, a numerical method is presented that finds a lower bound on the mutual information between a binary and an arbitrary finite random variable, where the joint distribution lies within a known variational distance of a known joint distribution. This lower bound can be applied to mutual information estimation with confidence intervals.
- Comment: 4 pages, 3 figures
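- As a rough illustration of the problem setting (not the paper's own method), the sketch below numerically minimizes the mutual information I(X;Y) over joint distributions of a binary X and a finite Y lying within variational distance d of a known joint distribution, using a generic constrained optimizer. The function names, the use of scipy's SLSQP solver, and the convention that variational distance equals half the L1 distance are assumptions made for this sketch.

```python
import numpy as np
from scipy.optimize import minimize


def mutual_information(p):
    """Mutual information I(X;Y) in bits for a joint distribution p of shape (2, m)."""
    px = p.sum(axis=1, keepdims=True)   # marginal of the binary variable X
    py = p.sum(axis=0, keepdims=True)   # marginal of the finite variable Y
    mask = p > 0                        # skip zero entries (0 * log 0 := 0)
    return float(np.sum(p[mask] * np.log2(p[mask] / (px @ py)[mask])))


def mi_lower_bound(p0, d):
    """Approximate lower bound: minimize I(X;Y) over joint distributions q
    with variational distance at most d from the known joint p0 (shape (2, m)).
    This is a local optimization, so it only approximates the tight bound."""
    p0 = np.asarray(p0, dtype=float)

    def objective(q):
        return mutual_information(q.reshape(p0.shape))

    constraints = [
        # q must be a probability distribution (entries bounded below via `bounds`).
        {"type": "eq", "fun": lambda q: q.sum() - 1.0},
        # Variational-distance ball: (1/2) * sum |q - p0| <= d  (assumed convention).
        {"type": "ineq", "fun": lambda q: d - 0.5 * np.abs(q - p0.ravel()).sum()},
    ]
    bounds = [(0.0, 1.0)] * p0.size
    res = minimize(objective, p0.ravel(), method="SLSQP",
                   bounds=bounds, constraints=constraints)
    return res.fun


# Example: a known joint distribution of a binary X and a ternary Y.
p_known = np.array([[0.30, 0.10, 0.10],
                    [0.05, 0.25, 0.20]])
print(mi_lower_bound(p_known, d=0.05))
```

- Because the objective is minimized with a local solver, the value returned is only an approximation of the bound the paper derives; the paper's numerical method is constructed to make the bound tight.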
- Subjects :
- Computer Science - Information Theory
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.1301.5937
- Document Type :
- Working Paper