GIBBS POSTERIOR CONVERGENCE AND THE THERMODYNAMIC FORMALISM
In this paper we consider the posterior consistency of Bayesian inference procedures when the family of models consists of appropriate stochastic processes. Specifically, we suppose that one observes an unknown ergodic process and has access to a family of models consisting of dependent processes. In this context, we consider Gibbs posterior inference, a loss-based generalization of standard Bayesian inference. Our main results characterize the asymptotic behavior of the Gibbs posterior distributions on the space of models. Furthermore, we show that in the case of properly specified models our convergence results may be used to establish posterior consistency. Our model processes are defined via the thermodynamic formalism for dynamical systems, and they allow for a large degree of dependence, including both Markov chains of unbounded order and processes that are not Markov of any order. This work establishes close connections between Gibbs posterior inference and the thermodynamic formalism for dynamical systems, which we hope will lead to new questions and results in both nonparametric Bayesian analysis and the thermodynamic formalism.
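As a point of orientation for readers unfamiliar with the term, the Gibbs posterior referred to in the abstract is standardly defined as follows (a generic sketch of the usual construction, not a statement of this paper's specific assumptions). Given a prior $\pi$ on a model space $\Theta$, an empirical loss $\ell_n(\theta)$ accumulated over $n$ observations, and an inverse temperature (learning rate) $\lambda > 0$, the Gibbs posterior assigns to a measurable set $A \subseteq \Theta$ the mass

$$
\pi_n(A) \;=\; \frac{\displaystyle\int_A e^{-\lambda\, \ell_n(\theta)}\, \pi(d\theta)}{\displaystyle\int_\Theta e^{-\lambda\, \ell_n(\theta)}\, \pi(d\theta)}.
$$

When $\ell_n(\theta)$ is the negative log-likelihood of the data under model $\theta$ and $\lambda = 1$, this reduces to the standard Bayesian posterior; general losses yield the "loss-based generalization" the abstract describes.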
Related Subject Headings
- Statistics & Probability
- 4905 Statistics
- 4901 Applied mathematics
- 0104 Statistics
- 0102 Applied Mathematics