
Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels

Conference Publication
Reeves, G; Pfister, HD; Dytso, A
Published in: IEEE International Symposium on Information Theory - Proceedings
August 15, 2018

This paper focuses on the mutual information and minimum mean-squared error (MMSE) as a function of a matrix-valued signal-to-noise ratio (SNR) for a linear Gaussian channel with arbitrary input distribution. As shown by Lamarca, the mutual information is a concave function of a positive semidefinite matrix, which we call the matrix SNR. This implies that the mapping from the matrix SNR to the MMSE matrix is monotone decreasing. Building upon these functional properties, we start to construct a unifying framework that provides a bridge between classical information-theoretic inequalities, such as the entropy power inequality, and interpolation techniques used in statistical physics and random matrix theory. This framework provides new insight into the structure of phase transitions in coding theory and compressed sensing. In particular, it is shown that the parallel combination of linear channels with freely independent matrices can be characterized succinctly via free convolution.
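As a concrete illustration of the functional properties described in the abstract, the following sketch (not taken from the paper) checks them numerically in the special case of a Gaussian input X ~ N(0, Sigma_X) observed through Y = S^{1/2} X + N with N ~ N(0, I), where both the mutual information and the MMSE matrix have closed forms. The covariance Sigma_X, the matrix SNRs S0 and S1, and the interpolation path are illustrative choices, not constructions from the paper.

```python
# Minimal numerical sketch (assumptions: Gaussian input, identity noise covariance).
# For X ~ N(0, Sigma_X) and Y = S^{1/2} X + N with N ~ N(0, I):
#   I(S)    = 1/2 * logdet(I + S Sigma_X)      (mutual information, in nats)
#   MMSE(S) = (Sigma_X^{-1} + S)^{-1}          (MMSE / posterior covariance matrix)
import numpy as np

def mutual_information(S, Sigma_X):
    """I(S) for a Gaussian input, in nats."""
    d = Sigma_X.shape[0]
    _, logdet = np.linalg.slogdet(np.eye(d) + S @ Sigma_X)
    return 0.5 * logdet

def mmse_matrix(S, Sigma_X):
    """Error covariance of the conditional mean estimator for a Gaussian input."""
    return np.linalg.inv(np.linalg.inv(Sigma_X) + S)

rng = np.random.default_rng(0)
d = 3
A = rng.standard_normal((d, d))
Sigma_X = A @ A.T + np.eye(d)        # an arbitrary input covariance
S0 = np.zeros((d, d))                # two PSD matrix SNRs with S1 >= S0
B = rng.standard_normal((d, d))
S1 = B @ B.T

# Concavity of I along the segment t -> (1 - t) S0 + t S1:
# second differences should be <= 0 (up to floating-point error).
ts = np.linspace(0.0, 1.0, 101)
I_vals = np.array([mutual_information((1 - t) * S0 + t * S1, Sigma_X) for t in ts])
print("max second difference:", np.diff(I_vals, 2).max())

# Monotonicity of the MMSE map: S1 >= S0 in the PSD order should give
# MMSE(S1) <= MMSE(S0), i.e. MMSE(S0) - MMSE(S1) is PSD.
gap = mmse_matrix(S0, Sigma_X) - mmse_matrix(S1, Sigma_X)
print("min eigenvalue of MMSE(S0) - MMSE(S1):", np.linalg.eigvalsh(gap).min())
```

For general (non-Gaussian) inputs these closed forms no longer apply; the sketch only verifies, in the Gaussian special case, the concavity and monotonicity statements quoted in the abstract.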


Published In

IEEE International Symposium on Information Theory - Proceedings

DOI

10.1109/ISIT.2018.8437326

ISSN

2157-8095

Publication Date

August 15, 2018

Volume

2018-June

Start / End Page

1754 / 1758
 

Citation

APA: Reeves, G., Pfister, H. D., & Dytso, A. (2018). Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels. In IEEE International Symposium on Information Theory - Proceedings (Vol. 2018-June, pp. 1754–1758). https://doi.org/10.1109/ISIT.2018.8437326
Chicago: Reeves, G., H. D. Pfister, and A. Dytso. “Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels.” In IEEE International Symposium on Information Theory - Proceedings, 2018-June:1754–58, 2018. https://doi.org/10.1109/ISIT.2018.8437326.
ICMJE: Reeves G, Pfister HD, Dytso A. Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels. In: IEEE International Symposium on Information Theory - Proceedings. 2018. p. 1754–8.
MLA: Reeves, G., et al. “Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels.” IEEE International Symposium on Information Theory - Proceedings, vol. 2018-June, 2018, pp. 1754–58. Scopus, doi:10.1109/ISIT.2018.8437326.
NLM: Reeves G, Pfister HD, Dytso A. Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels. IEEE International Symposium on Information Theory - Proceedings. 2018. p. 1754–1758.
