
Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference

Publication, Conference
Plummer, S; Zhou, S; Bhattacharya, A; Dunson, D; Pati, D
Published in: Proceedings of Machine Learning Research
January 1, 2021

Transformation-based methods have been an attractive approach to nonparametric inference for problems such as unconditional and conditional density estimation, owing to their hierarchical structure, which models the data as a flexible transformation of a set of common latent variables. More recently, transformation-based models have been used in variational inference (VI) to construct flexible implicit families of variational distributions. However, their use in both nonparametric inference and variational inference has lacked theoretical justification. We provide theoretical justification for the use of non-linear latent variable models (NL-LVMs) in nonparametric inference by showing that the support of the transformation-induced prior on the space of densities is sufficiently large in the L1 sense. We also show that, when a Gaussian process (GP) prior is placed on the transformation function, the posterior concentrates at the optimal rate up to a logarithmic factor. Building on the flexibility demonstrated in the nonparametric setting, we use the NL-LVM to construct an implicit family of variational distributions, termed GP-IVI. We delineate sufficient conditions under which GP-IVI achieves optimal risk bounds and approximates the true posterior in the sense of the Kullback-Leibler divergence. To the best of our knowledge, this is the first work to provide theoretical guarantees for implicit variational inference.
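
As a reading aid (not part of the published abstract), a minimal sketch of a commonly used NL-LVM formulation is given below; the specific choice of uniform latent variables and Gaussian noise is an assumption here, since the abstract does not spell out the model. Each observation is a GP-transformed latent variable plus noise,
\[
y_i = \mu(\eta_i) + \epsilon_i, \qquad \eta_i \stackrel{iid}{\sim} \mathrm{Unif}(0,1), \qquad \epsilon_i \stackrel{iid}{\sim} \mathrm{N}(0,\sigma^2), \qquad \mu \sim \mathrm{GP}(0, K),
\]
which induces the density
\[
f_{\mu,\sigma}(y) = \int_0^1 \phi_\sigma\bigl(y - \mu(x)\bigr)\, dx,
\]
where \(\phi_\sigma\) denotes the \(\mathrm{N}(0,\sigma^2)\) density. The large-support and posterior-contraction statements in the abstract concern the prior that the pair \((\mu,\sigma)\) places on \(f_{\mu,\sigma}\).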


Published In

Proceedings of Machine Learning Research

EISSN

2640-3498

Publication Date

January 1, 2021

Volume

130

Start / End Page

2449 / 2457
 

Citation

APA: Plummer, S., Zhou, S., Bhattacharya, A., Dunson, D., & Pati, D. (2021). Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference. In Proceedings of Machine Learning Research (Vol. 130, pp. 2449–2457).
Chicago: Plummer, S., S. Zhou, A. Bhattacharya, D. Dunson, and D. Pati. “Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference.” In Proceedings of Machine Learning Research, 130:2449–57, 2021.
ICMJE: Plummer S, Zhou S, Bhattacharya A, Dunson D, Pati D. Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference. In: Proceedings of Machine Learning Research. 2021. p. 2449–57.
MLA: Plummer, S., et al. “Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference.” Proceedings of Machine Learning Research, vol. 130, 2021, pp. 2449–57.
NLM: Plummer S, Zhou S, Bhattacharya A, Dunson D, Pati D. Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference. Proceedings of Machine Learning Research. 2021. p. 2449–2457.
