Two-moment inequalities for Rényi entropy and mutual information

Conference Paper

© 2017 IEEE. This paper explores some applications of a two-moment inequality for the integral of the r-th power of a function, where 0 < r < 1. The first contribution is an upper bound on the Rényi entropy of a random vector in terms of two different moments. When one of the moments is the zeroth moment, these bounds recover previous results based on maximum entropy distributions under a single moment constraint. More generally, evaluating the bound with two carefully chosen nonzero moments can lead to significant improvements at a modest increase in complexity. The second contribution is a method for upper bounding mutual information in terms of certain integrals involving the variance of the conditional density. These bounds have a number of useful properties arising from their connection with variance decompositions.
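For context, the quantity being bounded is the differential Rényi entropy of order r, whose standard definition (general background, not reproduced from the paper itself) makes the link to the integral of the r-th power of a density explicit:

```latex
% Differential Rényi entropy of order r (0 < r < 1) of a random
% vector X with density p on R^n. Because 1/(1-r) > 0 in this range,
% any upper bound on \int p^r in terms of moments of p translates
% directly into an upper bound on h_r(X).
\[
  h_r(X) \;=\; \frac{1}{1-r}\,
  \log \int_{\mathbb{R}^n} p(x)^r \,\mathrm{d}x,
  \qquad 0 < r < 1 .
\]
```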

Cited Authors

  • Reeves, G

Published Date

  • August 9, 2017

Start / End Page

  • 664 - 668

International Standard Serial Number (ISSN)

  • 2157-8095

International Standard Book Number 13 (ISBN-13)

  • 9781509040964

Digital Object Identifier (DOI)

  • 10.1109/ISIT.2017.8006611

Citation Source

  • Scopus