Extension of de Bruijn's identity to dependent non-Gaussian noise channels
- Source :
- J. Appl. Probab. 53, no. 2 (2016), 360-368
- Publication Year :
- 2016
- Publisher :
- Cambridge University Press (CUP), 2016.
-
Abstract
- De Bruijn's identity relates two important concepts in information theory: Fisher information and differential entropy. Departing from the common practice in the literature, in this paper we consider general additive non-Gaussian noise channels in which, more realistically, the input signal and the additive noise are not independently distributed. It is shown that, for a general dependent signal and noise, the first derivative of the differential entropy is directly related to the conditional mean estimate of the input. Then, by using Gaussian and Farlie–Gumbel–Morgenstern copulas, special versions of the result are given for the case of additive normally distributed noise. The previous result on independent Gaussian noise channels is included as a special case. Illustrative examples are also provided.
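For orientation, the classical form of de Bruijn's identity (the independent-Gaussian special case that the paper generalizes) states that for Y_t = X + sqrt(t)·Z, with Z standard normal and independent of X, d/dt h(Y_t) = (1/2) J(Y_t), where h is differential entropy and J is Fisher information. A minimal numerical sketch below checks this for a Gaussian input, where both sides are available in closed form; the function names are illustrative, not from the paper.

```python
import math

# Classical de Bruijn identity (independent Gaussian noise):
#   d/dt h(X + sqrt(t) Z) = (1/2) * J(X + sqrt(t) Z),  Z ~ N(0, 1) indep. of X.
# For X ~ N(0, sigma^2), Y_t = X + sqrt(t) Z ~ N(0, sigma^2 + t), so
#   h(Y_t) = 0.5 * ln(2 pi e (sigma^2 + t))   and   J(Y_t) = 1 / (sigma^2 + t).

def entropy(t, sigma2=1.0):
    """Differential entropy of Y_t for a Gaussian input X ~ N(0, sigma2)."""
    return 0.5 * math.log(2 * math.pi * math.e * (sigma2 + t))

def fisher(t, sigma2=1.0):
    """Fisher information of Y_t for a Gaussian input X ~ N(0, sigma2)."""
    return 1.0 / (sigma2 + t)

t, dt = 0.5, 1e-6
lhs = (entropy(t + dt) - entropy(t - dt)) / (2 * dt)  # numerical d/dt h(Y_t)
rhs = 0.5 * fisher(t)                                  # (1/2) J(Y_t)
print(abs(lhs - rhs) < 1e-8)  # the two sides of the identity agree
```

The dependent-noise extension in the paper replaces the right-hand side with a term involving the conditional mean estimate of the input; the closed-form check above covers only the independent special case.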
- Subjects :
- Statistics and Probability
General Mathematics
Statistics
Applied mathematics
Mathematics
Statistics, Probability and Uncertainty
Information theory
Differential entropy
Fisher information
Conditional expectation
Gaussian
Gaussian noise
Gaussian copula
Farlie–Gumbel–Morgenstern copula
Noise
De Bruijn sequence
54C70
62H20
Details
- ISSN :
- 1475-6072 and 0021-9002
- Volume :
- 53
- Database :
- OpenAIRE
- Journal :
- Journal of Applied Probability
- Accession number :
- edsair.doi.dedup.....fdad925a2f273ebb3e5f26cefbe12f72