The Log-Volume of Optimal Codes for Memoryless Channels, Asymptotically Within A Few Nats

Authors:
Moulin, Pierre
Source:
IEEE Transactions on Information Theory, 2017
Publication Year:
2013

Abstract

Shannon's analysis of the fundamental capacity limits for memoryless communication channels has been refined over time. In this paper, the maximum volume $M_{\mathrm{avg}}^*(n,\epsilon)$ of length-$n$ codes subject to an average decoding error probability $\epsilon$ is shown to satisfy the following tight asymptotic lower and upper bounds as $n \to \infty$:
\[
\underline{A}_\epsilon + o(1) \le \log M_{\mathrm{avg}}^*(n,\epsilon) - \left[ nC - \sqrt{nV_\epsilon}\, Q^{-1}(\epsilon) + \tfrac{1}{2} \log n \right] \le \overline{A}_\epsilon + o(1),
\]
where $C$ is the Shannon capacity, $V_\epsilon$ is the $\epsilon$-channel dispersion (or second-order coding rate), $Q$ is the tail probability of the standard normal distribution, and the constants $\underline{A}_\epsilon$ and $\overline{A}_\epsilon$ are explicitly identified. This expression holds under mild regularity assumptions on the channel, including nonsingularity. The gap $\overline{A}_\epsilon - \underline{A}_\epsilon$ is one nat for weakly symmetric channels in the Cover-Thomas sense, and typically a few nats for other symmetric channels, for the binary symmetric channel, and for the $Z$ channel. The derivation is based on strong large-deviations analysis and refined central-limit asymptotics. A random coding scheme that achieves the lower bound is presented; its codewords are drawn from a capacity-achieving input distribution modified by an $O(1/\sqrt{n})$ correction term.

Comment: 75 pages, 8 figures. This is the final version to appear in the IEEE Transactions on Information Theory, 2017.
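As a quick numerical companion to the bracketed normal approximation above, the following minimal Python sketch (not from the paper; the capacity and dispersion formulas are the standard ones for the binary symmetric channel, in nats, and all function names are illustrative) evaluates $nC - \sqrt{nV_\epsilon}\,Q^{-1}(\epsilon) + \frac{1}{2}\log n$ for a few blocklengths, omitting the $O(1)$ constants $\underline{A}_\epsilon$, $\overline{A}_\epsilon$ that the paper identifies:

import math
from statistics import NormalDist

def bsc_capacity(p):
    # Capacity of BSC(p) in nats: log 2 minus the binary entropy of p.
    h = -p * math.log(p) - (1 - p) * math.log(1 - p)
    return math.log(2) - h

def bsc_dispersion(p):
    # Dispersion V of BSC(p) in nats^2: p(1-p) log^2((1-p)/p).
    return p * (1 - p) * math.log((1 - p) / p) ** 2

def log_volume_approx(n, eps, p):
    # nC - sqrt(nV) Q^{-1}(eps) + (1/2) log n; the O(1) terms are omitted.
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return (n * bsc_capacity(p)
            - math.sqrt(n * bsc_dispersion(p)) * q_inv
            + 0.5 * math.log(n))

for n in (200, 500, 1000, 2000):
    print(n, round(log_volume_approx(n, eps=1e-3, p=0.11), 2))

Dividing the output by $n$ shows the best achievable rate approaching $C$ from below as $n$ grows, with the $\sqrt{n}$ dispersion term dominating the gap at moderate blocklengths.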

Details

Database:
arXiv
Journal:
IEEE Transactions on Information Theory, 2017
Publication Type:
Report
Accession number:
edsarx.1311.0181
Document Type:
Working Paper