
Specifying outcome variable in binomial GLMM: single responses vs cbind?

2 messages · Malcolm Fairbrother, Ben Bolker

Hi Ben,
This thread is relevant in this regard:
https://stat.ethz.ch/pipermail/r-sig-mixed-models/2015q4/024241.html
At least on my machine, I found a substantial difference in the parameter
estimates. The second form (cbind of successes and failures) seemed more
reliable than the first (one 0/1 response per row), as you'll see from the thread.
Do you get the same result?
Best wishes,
Malcolm
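The two outcome specifications under discussion can be sketched as follows; the data, variable names (`y`, `grp`, `succ`, `fail`), and intercept-only formulas are illustrative, not taken from the linked thread, and the `glmer()` calls are commented out so the snippet runs without lme4:

```r
# Simulated Bernoulli data: one 0/1 response per row, grouping factor 'grp'
# (hypothetical example, not the data from the thread).
set.seed(1)
d <- data.frame(grp = factor(rep(1:10, each = 20)))
d$y <- rbinom(nrow(d), size = 1, prob = plogis(rnorm(10)[d$grp]))

# Form 1: disaggregated, one Bernoulli response per row
# fit1 <- lme4::glmer(y ~ 1 + (1 | grp), data = d, family = binomial)

# Form 2: aggregated to successes/failures per group, passed via cbind()
agg <- aggregate(y ~ grp, data = d,
                 FUN = function(x) c(succ = sum(x), fail = sum(1 - x)))
agg <- data.frame(grp = agg$grp,
                  succ = agg$y[, "succ"], fail = agg$y[, "fail"])
# fit2 <- lme4::glmer(cbind(succ, fail) ~ 1 + (1 | grp),
#                     data = agg, family = binomial)
```

Because every predictor here is constant within groups, the two forms carry the same information and their likelihoods should agree up to a constant; the question in the thread is why the fitted estimates nonetheless differ.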



Date: Sat, 2 Jul 2016 13:06:30 -0400

Really interesting (and somewhat disconcerting).

  Running it with glmmTMB (which uses Laplace!) gives different results
from glmer with nAGQ=1 -- suggesting some issue not just with Laplace,
but with lme4's implementation thereof?? (I don't think the problem is
an optimization failure ...)
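The cross-check described here (the same binary-response model fit with Laplace in both packages) can be sketched as below; the data and formula are hypothetical, and the fits run only if both packages are installed:

```r
# Compare lme4's Laplace fit (nAGQ = 1) with glmmTMB's Laplace fit on the
# same simulated binary data (hypothetical example, not the thread's data).
have_pkgs <- requireNamespace("lme4", quietly = TRUE) &&
  requireNamespace("glmmTMB", quietly = TRUE)
if (have_pkgs) {
  set.seed(1)
  d <- data.frame(grp = factor(rep(1:10, each = 20)))
  d$y <- rbinom(nrow(d), size = 1, prob = plogis(rnorm(10)[d$grp]))
  f1 <- lme4::glmer(y ~ 1 + (1 | grp), data = d,
                    family = binomial, nAGQ = 1)    # Laplace
  f2 <- glmmTMB::glmmTMB(y ~ 1 + (1 | grp), data = d,
                         family = binomial)         # also Laplace
  print(c(lme4 = unname(lme4::fixef(f1)),
          glmmTMB = unname(glmmTMB::fixef(f2)$cond)))
}
```

If the two intercept estimates differ materially, that points at the implementation rather than at the Laplace approximation itself.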
   It makes *some* sense that Gauss-Hermite quadrature would be useful
for this case (since binary data are far from satisfying a normality
assumption), but that doesn't necessarily hold up to scrutiny, since what
needs to be approximately Normal is not the likelihood per point but
the likelihood per conditional mode [which should be the same, up to a
constant, for the aggregated and disaggregated data ...]
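What a multi-node quadrature rule buys can be illustrated with a standalone Gauss-Hermite rule; the Golub-Welsch construction below is a base-R sketch for illustration, not lme4's internal code:

```r
# Gauss-Hermite nodes and weights (weight function exp(-x^2)) via the
# Golub-Welsch eigenvalue method -- an illustrative sketch, not lme4 internals.
gauss_hermite <- function(n) {
  b <- sqrt(seq_len(n - 1) / 2)       # Jacobi matrix off-diagonal entries
  J <- matrix(0, n, n)
  J[cbind(seq_len(n - 1), 2:n)] <- b
  J[cbind(2:n, seq_len(n - 1))] <- b
  e <- eigen(J, symmetric = TRUE)
  list(nodes = e$values,                       # quadrature nodes
       weights = sqrt(pi) * e$vectors[1, ]^2)  # quadrature weights
}

gh <- gauss_hermite(5)
# E[u^2] for u ~ N(0, 1), using the substitution u = sqrt(2) * x:
Eu2 <- sum(gh$weights * (sqrt(2) * gh$nodes)^2) / sqrt(pi)
# A 5-node rule integrates polynomials up to degree 9 exactly, so Eu2 = 1.
```

Laplace corresponds to the single-node version of this idea; adding nodes lets the rule track an integrand that a single Gaussian approximation misses, which is the usual argument for nAGQ > 1 with binary data.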

  Doug Bates, if you're reading, would you be willing to try this out
with MixedModels.jl ... ?

  Ben Bolker
On 16-07-04 02:11 PM, Malcolm Fairbrother wrote: