A Bregman Matrix and the Gradient of Mutual Information for Vector Poisson and Gaussian Channels

Published

Journal Article

A generalization of Bregman divergence is developed and utilized to unify vector Poisson and Gaussian channel models, from the perspective of the gradient of mutual information. The gradient is with respect to the measurement matrix in a compressive-sensing setting, and mutual information is considered for signal recovery and classification. Existing gradient-of-mutual-information results for scalar Poisson models are recovered as special cases, as are known results for the vector Gaussian model. The Bregman-divergence generalization yields a Bregman matrix, and this matrix induces numerous matrix-valued metrics. The metrics associated with the Bregman matrix are detailed, as are its other properties. The Bregman matrix is also utilized to connect the relative entropy and mismatched minimum mean squared error. Two applications are considered: 1) compressive sensing with a Poisson measurement model and 2) compressive topic modeling for analysis of document corpora (word-count data). In both of these settings, we use the developed theory to optimize the compressive measurement matrix, for signal recovery and classification.
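The abstract's central object is the Bregman divergence; as a point of reference, the classical vector-valued form (not the paper's matrix-valued generalization, which is not reproduced here) can be sketched as follows, with two illustrative generators: the squared Euclidean norm, associated with the Gaussian channel, and the negative-entropy generator, whose divergence is the generalized KL divergence naturally associated with the Poisson channel.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """Classical Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad_phi(y), x - y>.

    Illustrative scalar-valued form only; the paper's "Bregman matrix"
    generalization is matrix-valued and is not reproduced here.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return phi(x) - phi(y) - float(grad_phi(y) @ (x - y))

# Generator phi(v) = ||v||^2 recovers squared Euclidean distance
# (the divergence underlying the Gaussian-channel / MMSE results).
phi_sq = lambda v: float(v @ v)
grad_phi_sq = lambda v: 2.0 * v

# Generator phi(v) = sum_i v_i log v_i (negative entropy, v > 0) recovers the
# generalized KL divergence: sum_i [x_i log(x_i / y_i) - x_i + y_i].
phi_ent = lambda v: float(np.sum(v * np.log(v)))
grad_phi_ent = lambda v: np.log(v) + 1.0

x, y = np.array([1.0, 2.0]), np.array([0.0, 1.0])
print(bregman_divergence(phi_sq, grad_phi_sq, x, y))      # ||x - y||^2 = 2.0

x2, y2 = np.array([1.0, 2.0]), np.array([2.0, 1.0])
print(bregman_divergence(phi_ent, grad_phi_ent, x2, y2))  # = log 2 here
```

Both checks follow directly from the definition: plugging in phi_sq gives phi(x) - phi(y) - 2 y·(x - y) = ||x - y||^2, and plugging in phi_ent telescopes to the generalized KL divergence.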

Cited Authors

  • Wang, L; Carlson, DE; Rodrigues, MRD; Calderbank, R; Carin, L

Published Date

  • January 1, 2014

Published In

  • IEEE Transactions on Information Theory

Volume / Issue

  • 60 / 5

Start / End Page

  • 2611 - 2629

International Standard Serial Number (ISSN)

  • 0018-9448

Digital Object Identifier (DOI)

  • 10.1109/TIT.2014.2307068

Citation Source

  • Scopus