Differential Entropy of the Conditional Expectation Under Additive Gaussian Noise
- Resource Type
- Periodical
- Authors
- Atalik, A.; Kose, A.; Gastpar, M.
- Source
- IEEE Transactions on Signal Processing, vol. 70, pp. 4851-4866, 2022
- Subject
- Signal Processing and Analysis
- Communication, Networking and Broadcast Technologies
- Computing and Processing
- Entropy
- Gaussian noise
- Additives
- Noise measurement
- Upper bound
- Random variables
- Probability density function
- Differential entropy
- conditional mean estimator
- exponential family
- remote source coding problem
- CEO problem
- Language
- English
- ISSN
- 1053-587X
1941-0476
The conditional mean is a fundamental and important quantity whose applications include the theories of estimation and rate-distortion. It is also notoriously difficult to work with. This paper establishes novel bounds on the differential entropy of the conditional mean in the case of finite-variance input signals and additive Gaussian noise. The main result is a new lower bound in terms of the differential entropies of the input signal and the noisy observation. These results are also extended to the vector Gaussian channel and to the natural exponential family. Various other properties, such as upper bounds, asymptotics, a Taylor series expansion, and a connection to Fisher information, are obtained. Two applications of the lower bound, to the remote source coding and CEO problems, are discussed.
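As a minimal numerical sketch of the Gaussian special case (my own illustration, not taken from the paper): when X is Gaussian and Y = X + Z with unit-variance Gaussian noise Z, the conditional mean E[X|Y] is linear in Y and hence Gaussian, so its differential entropy is available in closed form. Assuming the abstract's lower bound takes the form 2h(X) - h(Y), one can check it holds with equality in this case.

```python
import math

def h_gauss(var):
    """Differential entropy (in nats) of a Gaussian with variance `var`."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

snr = 4.0                     # input variance; noise variance normalized to 1
h_X = h_gauss(snr)            # h(X), with X ~ N(0, snr)
h_Y = h_gauss(snr + 1.0)      # h(Y), with Y = X + Z, Z ~ N(0, 1)

# For Gaussian X the conditional mean is linear, E[X|Y] = (snr/(snr+1)) * Y,
# so it is Gaussian with variance snr^2/(snr+1).
h_cond_mean = h_gauss(snr**2 / (snr + 1.0))

# Assumed form of the lower bound from the abstract: 2*h(X) - h(Y).
# For a Gaussian input it should coincide with h(E[X|Y]).
lower_bound = 2 * h_X - h_Y
print(h_cond_mean, lower_bound)
assert abs(h_cond_mean - lower_bound) < 1e-12
```

The equality here reflects that the Gaussian input is the extremal case; for general finite-variance inputs the bound would be an inequality.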