
The redundancy of source coding with a fidelity criterion. Part one: Known statistics

Authors :
Zhang, Zhen
Yang, En-hui
Wei, Victor K.
Source :
IEEE Transactions on Information Theory, Jan. 1997, Vol. 43, Issue 1, p. 71, 21 pp.
Publication Year :
1997

Abstract

The problem of the redundancy of source coding with respect to a fidelity criterion is considered. For any fixed rate R > 0 and any memoryless source with finite source and reproduction alphabets and a common distribution p, the nth-order distortion redundancy D_n(R) of fixed-rate coding is defined as the minimum of the difference between the expected distortion per symbol of any block code with length n and rate R and the distortion-rate function d(p, R) of the source p. It is demonstrated that for sufficiently large n, D_n(R) is equal to -(∂/∂R) d(p, R) · (ln n)/(2n) + o(ln n / n), where (∂/∂R) d(p, R) is the partial derivative of d(p, R) evaluated at R and assumed to exist. For any fixed distortion level d > 0 and any memoryless source p, the nth-order rate redundancy R_n(d) of coding at fixed distortion level d (or by using d-semifaithful codes) is defined as the minimum of the difference between the expected rate per symbol of any d-semifaithful code of length n and the rate-distortion function R(p, d) of p evaluated at d. It is proved that for sufficiently large n, R_n(d) is upper-bounded by (ln n)/n + o(ln n / n) and lower-bounded by (ln n)/(2n) + o(ln n / n). As a by-product, the lower bound of R_n(d) derived in this paper gives a positive answer to a recent conjecture proposed by Yu and Speed.

Index Terms: source coding with a fidelity criterion, rate-distortion (distortion-rate) function, redundancy, random-coding argument, upper joint entropy, lower mutual information.
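For readability, the two asymptotic results stated in the abstract can be typeset in standard notation as follows; this is only a restatement of the claims above, not an additional result of the paper.

D_n(R) = -\frac{\partial d(p, R)}{\partial R} \cdot \frac{\ln n}{2n} + o\!\left(\frac{\ln n}{n}\right), \qquad \text{for all sufficiently large } n,

\frac{\ln n}{2n} + o\!\left(\frac{\ln n}{n}\right) \;\le\; R_n(d) \;\le\; \frac{\ln n}{n} + o\!\left(\frac{\ln n}{n}\right), \qquad \text{for all sufficiently large } n.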

Details

ISSN :
0018-9448
Volume :
43
Issue :
1
Database :
Gale General OneFile
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
edsgcl.19175580