Variational inference and model selection with generalized evidence bounds

Publication Status

  • Published

Publication Type

  • Conference Paper

Abstract

Recent advances in the scalability and flexibility of variational inference have made it successful at uncovering hidden patterns in complex data. In this work we propose a new variational bound formulation, yielding an estimator that extends beyond the conventional evidence lower bound. It naturally subsumes the importance-weighted and Rényi bounds as special cases, and is provably sharper than both. We also present an improved estimator for variational learning, and advocate a novel update rule for the variational parameters with a high signal-to-variance ratio. We discuss model-selection issues that arise in existing evidence-lower-bound-based variational inference procedures, and show how the flexibility of our new formulation can be leveraged to address them. Empirical evidence is provided to validate our claims.
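For background (these are the standard forms from the variational inference literature, not excerpted from the paper itself): the conventional, importance-weighted, and Rényi bounds the abstract refers to can be written, for a model $p(x,z)$ and variational distribution $q(z|x)$, as

\log p(x) \;\ge\; \mathcal{L}_{\mathrm{ELBO}} = \mathbb{E}_{q(z|x)}\!\left[ \log \frac{p(x,z)}{q(z|x)} \right],

\mathcal{L}_K = \mathbb{E}_{z_{1:K} \sim q(z|x)}\!\left[ \log \frac{1}{K} \sum_{k=1}^{K} \frac{p(x,z_k)}{q(z_k|x)} \right],

\mathcal{L}_\alpha = \frac{1}{1-\alpha} \log \mathbb{E}_{q(z|x)}\!\left[ \left( \frac{p(x,z)}{q(z|x)} \right)^{1-\alpha} \right].

Here $\mathcal{L}_1 = \mathcal{L}_{\mathrm{ELBO}}$, the importance-weighted bound $\mathcal{L}_K$ is non-decreasing in $K$, and the Rényi bound $\mathcal{L}_\alpha$ recovers the ELBO as $\alpha \to 1$ and lower-bounds $\log p(x)$ for $\alpha > 0$. Per the abstract, the generalized evidence bound proposed in the paper subsumes these as special cases and is provably sharper.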

Cited Authors

  • Tao, C; Chen, L; Zhang, R; Henao, R; Carin, L

Published Date

  • January 1, 2018

Published In

  • 35th International Conference on Machine Learning, ICML 2018

Volume / Issue

  • 2 /

Start / End Page

  • 1419 - 1435

International Standard Book Number 13 (ISBN-13)

  • 9781510867963

Citation Source

  • Scopus