## Lognormal and gamma mixed negative binomial regression

In regression analysis of counts, the lack of simple and efficient algorithms for posterior computation has made Bayesian approaches appear unattractive and thus underdeveloped. We propose a lognormal and gamma mixed negative binomial (NB) regression model for counts and present efficient closed-form Bayesian inference. Unlike conventional Poisson models, the proposed approach has two free parameters that accommodate two different kinds of random effects, and it allows the incorporation of prior information, such as sparsity in the regression coefficients. By placing a gamma prior on the NB dispersion parameter r and a lognormal prior on the logit of the NB probability parameter p, efficient Gibbs sampling and variational Bayes inference are both developed. The closed-form updates are obtained by exploiting conditional conjugacy via both a compound Poisson representation and a Polya-Gamma data augmentation approach. The proposed Bayesian inference can be implemented routinely and generalizes readily to more complex settings involving multivariate dependence structures. The algorithms are illustrated using real examples.
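As a rough illustration of the generative model the abstract describes, the sketch below simulates counts whose dispersion r is gamma-distributed and whose probability parameter p has a normally distributed logit (i.e., a lognormal random effect on the odds scale). All hyperparameters, the design matrix, and the coefficient vector are hypothetical; the paper's Gibbs and variational inference algorithms are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 500, 3
X = rng.normal(size=(n, d))            # hypothetical covariates
beta = np.array([0.5, -1.0, 0.8])      # hypothetical regression coefficients

# NB dispersion r ~ Gamma(a, 1/b); a, b are hypothetical hyperparameters
a, b = 2.0, 1.0
r = rng.gamma(a, 1.0 / b)

# Lognormal random effect: logit(p) = X @ beta + eps, eps ~ Normal(0, sigma^2)
eps = rng.normal(0.0, 0.5, size=n)
psi = X @ beta + eps
p = 1.0 / (1.0 + np.exp(-psi))         # p = sigmoid(psi), in (0, 1)

# NumPy's negative_binomial(n, q) counts failures before n successes with
# success probability q; taking q = 1 - p gives mean r * p / (1 - p),
# matching the NB(r, p) convention used in the abstract.
y = rng.negative_binomial(r, 1.0 - p)
```

Under this parameterization the conditional mean of each count is r·p/(1−p), so the two free parameters (r and the lognormal noise scale) control two distinct sources of overdispersion, which is the flexibility the abstract contrasts with conventional Poisson regression.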



### Citation

*Proceedings of the 29th International Conference on Machine Learning, ICML 2012*, vol. 2, October 10, 2012, pp. 1343–1350.