Confidence Intervals for the Mutual Information
- Publication Year :
- 2013
Abstract
- By combining a bound on the absolute difference of the mutual information of two joint probability distributions at a fixed variational distance with a bound on the probability of a maximal deviation in variational distance between a true joint probability distribution and its empirical estimate, confidence intervals for the mutual information of two random variables with finite alphabets are established. Unlike previous results, these intervals require no assumptions on the distribution or the sample size.
- Comment: 5 pages, 2 figures
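The two ingredients named in the abstract can be illustrated with a short Python sketch: a plug-in mutual-information estimate computed from an empirical joint distribution, together with the standard textbook concentration bound P(||P̂ − P||₁ ≥ ε) ≤ 2^k · exp(−nε²/2) for a k-outcome distribution estimated from n i.i.d. samples. This is only a generic illustration of the type of deviation bound involved; the paper's own continuity and deviation bounds (and the resulting intervals) are not reproduced here, and all function names are illustrative.

```python
import numpy as np

def mutual_information(joint):
    """Plug-in mutual information (in bits) of a 2-D joint pmf array."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of X, column vector
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y, row vector
    mask = joint > 0                        # skip 0 * log 0 terms
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (px @ py)[mask])))

def empirical_joint(xs, ys, kx, ky):
    """Empirical joint pmf from paired samples over {0..kx-1} x {0..ky-1}."""
    counts = np.zeros((kx, ky))
    for x, y in zip(xs, ys):
        counts[x, y] += 1
    return counts / counts.sum()

def l1_deviation_radius(n, k, delta):
    """Radius eps with P(||P_hat - P||_1 >= eps) <= delta, obtained by
    inverting the standard bound 2^k * exp(-n * eps^2 / 2) (not the
    paper's sharper result)."""
    return np.sqrt(2.0 * (k * np.log(2) + np.log(1.0 / delta)) / n)
```

For example, `mutual_information([[0.5, 0], [0, 0.5]])` returns 1.0 bit for perfectly correlated fair binary variables, and `l1_deviation_radius(n, k, delta)` shrinks at the usual O(1/√n) rate as the sample size grows.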
- Subjects :
- Computer Science - Information Theory
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.1301.5942
- Document Type :
- Working Paper