
Choosing appropriate priors for bglmer mixed models in blme

Just to follow up on Gelman's Cauchy prior: it seems to work quite well even in GLMMs. I don't have any theoretical results yet, but if you look at the sampling distribution of the fixed effects for any model, they cluster rather nicely. You get "sane" estimates when no separation is involved, infinite estimates (or convergence failures) under complete or quasi-complete separation, and a third cluster of very large estimates when a group's responses are all 0s or all 1s. In that third case, a random effect can predict that group perfectly, but because the random effects are integrated out, the likelihood remains well defined. You just get very large estimates of the random effects, which in turn drive large estimates of the fixed effects.
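A sketch of the third case described above, as a simulation (the setup here — group count, effect sizes, seed — is my own illustration, not from any particular analysis): forcing one group's responses to all 1s typically pushes its estimated random effect far beyond the others, even though `glmer` still converges to a finite likelihood.

```r
## Illustration: one group with all-1 responses inflates that group's
## random-effect estimate (and the scale of the fixed effects with it),
## while the integrated likelihood remains well defined.
library(lme4)  # assumed installed

set.seed(1)
n_groups <- 10
n_per    <- 20
g <- factor(rep(seq_len(n_groups), each = n_per))
x <- rnorm(n_groups * n_per)
p <- plogis(0.5 * x + rnorm(n_groups)[g])
y <- rbinom(length(x), 1, p)
y[g == "1"] <- 1  # force one group to contain only 1s

m <- glmer(y ~ x + (1 | g), family = binomial)
ranef(m)$g  # the effect for group 1 tends to be far larger than the rest
```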

So long as you believe that effect magnitudes beyond some size pretty much never occur in nature for logistic regression, the Cauchy prior does a good job of pulling the extreme cases back down to earth while leaving the well-estimated ones roughly in place. That said, using the priors in blme to patch up a data set is really only advisable for checking the viability of a model (usually one among many, fit rapidly). After that, something like MCMCglmm for a fully Bayesian analysis is the way to go.
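For concreteness, a minimal sketch of fitting such a model in blme with a Cauchy prior on the fixed effects (a t prior with 1 degree of freedom; the scale of 2.5 follows Gelman et al.'s weakly informative default). The `fixef.prior` specification syntax here is from my reading of blme's interface and may differ across versions, and the simulated data are purely illustrative:

```r
## Sketch: bglmer with a Cauchy (t, df = 1) prior on the fixed effects,
## on data where one group's responses are all 1s. The prior shrinks the
## extreme estimates while leaving well-identified ones roughly in place.
library(blme)  # assumed installed; loads lme4 as well

set.seed(1)
g <- factor(rep(1:10, each = 20))
x <- rnorm(200)
y <- rbinom(200, 1, plogis(0.5 * x + rnorm(10)[g]))
y[g == "1"] <- 1  # one group all 1s: the extreme case discussed above

m_cauchy <- bglmer(y ~ x + (1 | g), family = binomial,
                   fixef.prior = t(df = 1, scale = 2.5))
fixef(m_cauchy)
```

This is the quick-viability-check use described above; for a final analysis the fully Bayesian route (e.g. MCMCglmm) is still preferable.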

Vince