Information Theory and Estimation Theory

Information theory and estimation theory have generally been regarded as two separate theories with little overlap. Recently, however, it has been recognized that the relations between the two theories are fundamental (e.g., relating the mutual information to the minimum mean-square error, MMSE) and can be very useful for transferring results from one area to the other. Beyond their intrinsic theoretical interest, such relations have already found several applications: the mercury/waterfilling optimal power allocation over a set of parallel Gaussian channels, a simple proof of the entropy power inequality, a simple proof of the monotonicity of the non-Gaussianness of sums of independent random variables, and the study of the extrinsic information of good codes.
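As a concrete illustration, consider the scalar Gaussian channel (a sketch under the standard assumptions of that setting: an input X of finite variance and standard Gaussian noise N independent of X). The relation alluded to above links the mutual information and the MMSE of estimating X from the channel output:

\frac{d}{d\,\mathrm{snr}}\, I\bigl(X;\sqrt{\mathrm{snr}}\,X+N\bigr) = \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr}) = \mathbb{E}\Bigl[\bigl(X-\mathbb{E}[X\mid\sqrt{\mathrm{snr}}\,X+N]\bigr)^{2}\Bigr].

Equivalently, in integral form,

I\bigl(X;\sqrt{\mathrm{snr}}\,X+N\bigr) = \frac{1}{2}\int_{0}^{\mathrm{snr}} \mathrm{mmse}(\gamma)\, d\gamma,

so the mutual information is half the area under the MMSE curve. Since the conditional mean \mathbb{E}[X\mid\cdot] is determined by the a posteriori probabilities, an MMSE estimate computed from those probabilities can be integrated numerically to evaluate the mutual information.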

We have further explored these connections in the vector Gaussian and arbitrary (non-Gaussian) settings. One interesting application of such a characterization is the efficient computation of the mutual information achieved by a given code over a channel from the symbolwise a posteriori probabilities, a quantity that previously could not be computed. We have also considered an alternative information measure, termed lautum information, which is distinct from mutual information.
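For reference, a sketch of the definition: whereas mutual information is the relative entropy between the joint distribution and the product of the marginals, lautum information swaps the two arguments,

I(X;Y) = D\bigl(P_{XY} \,\|\, P_X P_Y\bigr), \qquad L(X;Y) = D\bigl(P_X P_Y \,\|\, P_{XY}\bigr),

so it measures how far the product of the marginals is from the joint distribution rather than the reverse.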

Papers