Posterior computation with the Gibbs zig-zag sampler
Markov chain Monte Carlo (MCMC) sampling algorithms have dominated the literature on posterior computation. However, MCMC faces substantial hurdles in performing efficient posterior sampling for challenging Bayesian models, particularly in high-dimensional and large-data settings. Motivated in part by these hurdles, an intriguing new class of piecewise deterministic Markov processes (PDMPs) has recently been proposed as an alternative to MCMC. One of the most popular PDMPs is the zig-zag (ZZ) sampler. Such algorithms require a computable upper bound on the event rate in a Poisson thinning step, with performance improving as the bound tightens. To facilitate scaling to broader classes of problems, we propose a general class of Gibbs zig-zag (GZZ) samplers. GZZ allows parameters to be updated in blocks, with ZZ applied to some parameters and traditional MCMC-style updates to others. This provides a flexible framework for combining PDMPs with the rich literature on MCMC algorithms. We prove appealing theoretical properties of GZZ and demonstrate it on posterior sampling for logistic models with shrinkage priors for high-dimensional regression and random effects.
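To make the block structure and the thinning bound concrete, the following minimal Python sketch alternates zig-zag dynamics on logistic-regression coefficients with a conjugate Gibbs draw of a Gaussian prior precision. This is an illustrative discrete alternation under assumed conventions, not the exact continuous-time GZZ construction analyzed in the paper: the names `zigzag_block` and `gibbs_zigzag`, the fixed time horizon per sweep, and the Gamma(a0, b0) hyperprior on the precision are all hypothetical choices. The thinning step exploits the fact that the logistic score is bounded, so each coordinate's event rate admits an explicit affine upper bound along the linear trajectory.

```python
import numpy as np

def zigzag_block(X, y, beta, v, tau, horizon, rng):
    """Illustrative zig-zag block move for logistic-regression coefficients,
    run for a fixed time `horizon` at the current prior precision `tau`.

    Coordinate i flips at rate lambda_i(s) = max(0, v_i * dU/dbeta_i) along
    beta + s*v, where U is the negative log posterior. The logistic score is
    bounded by 1, so M_i = sum_j |X[j, i]| plus the Gaussian-prior term gives
    the affine thinning bound a_i + tau*s on lambda_i(s).
    """
    M = np.abs(X).sum(axis=0)                    # bound on the likelihood gradient
    t = 0.0
    while t < horizon:
        a = M + np.maximum(0.0, tau * v * beta)  # affine bound: a_i + tau*s
        E = rng.exponential(size=beta.size)
        # Invert the integrated bound a_i*s + tau*s^2/2 = E_i for candidate times.
        taus = (-a + np.sqrt(a**2 + 2.0 * tau * E)) / tau
        i = int(np.argmin(taus))
        s = min(taus[i], horizon - t)
        beta = beta + s * v                      # deterministic linear motion
        t += s
        if s < taus[i]:                          # hit the horizon before any event
            break
        grad_i = X[:, i] @ (1.0 / (1.0 + np.exp(-X @ beta)) - y) + tau * beta[i]
        # Poisson thinning: accept the velocity flip w.p. true rate / bound.
        if rng.uniform() < max(0.0, v[i] * grad_i) / (a[i] + tau * s):
            v[i] = -v[i]
    return beta, v

def gibbs_zigzag(X, y, n_sweeps=200, horizon=0.5, a0=1.0, b0=1.0, seed=0):
    """Alternate ZZ moves on beta with a conjugate Gibbs draw of the prior
    precision, tau | beta ~ Gamma(a0 + d/2, rate = b0 + ||beta||^2 / 2)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(d)
    v = rng.choice([-1.0, 1.0], size=d)          # velocities in {-1, +1}^d
    tau, draws = 1.0, []
    for _ in range(n_sweeps):
        beta, v = zigzag_block(X, y, beta, v, tau, horizon, rng)
        tau = rng.gamma(a0 + d / 2.0, 1.0 / (b0 + beta @ beta / 2.0))
        draws.append(beta.copy())                # record the state once per sweep
    return np.array(draws)
```

Each sweep composes two kernels, each of which preserves the joint posterior of (beta, tau): the zig-zag flow targets the conditional of beta given tau, and the Gamma draw is the exact full conditional of tau. This is the same block logic that GZZ formalizes in continuous time, where the non-ZZ updates occur at the event times of an additional Poisson clock rather than on a fixed schedule.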