
lmer output shows laplace approximation not reml

6 messages · Douglas Bates, Yolande Tra, Joshua Wiley +2 more

On Thu, Jul 19, 2012 at 9:20 PM, Yolande Tra <yolande.tra at gmail.com> wrote:
But Ben answered it.  When you specify family="poisson" you are
fitting a generalized linear mixed model.  The parameter estimates
provided for such a model by lme4 are the maximum likelihood
estimates, up to an approximation.  The default approximation is the
Laplace approximation.


This data set has quite a complicated design. I did not find any
similar example in the literature on lme4. According to the
investigator this is a partially nested design. Counts were collected
at different transects, different depths, and different sites at
different times. Time is continuous and assumed to be random; all the
other factors are categorical and fixed, with transect nested within
depth, which is nested within site. The three factors are definitely
nested within each other, but based on the attached files and the
table below, it looks like this is a repeated-measures design where
time (dive_id) is nested within the three-factor level combination. If
I am wrong so far, please correct me. I believe the main effect is
site (b), and level (a) is nested within depth (b), which in turn is
nested within site (b). dive_id, which also represents time, is random.
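
In lme4 formula syntax, a design like the one described might be sketched as follows. This is a sketch only: the variable names (`counts`, `site`, `depth`, `transect`, `dive_id`) and the simulated data frame are assumptions based on the description above, not the actual data set.

```r
## Sketch only: variable names are guesses from the description;
## simulated data are included just so the call is runnable.
library(lme4)
set.seed(42)
dat <- expand.grid(site     = factor(1:3),
                   depth    = factor(1:2),
                   transect = factor(1:2),
                   dive_id  = factor(1:5))
dat$counts <- rpois(nrow(dat), lambda = 3)

## site/depth/transect expands to site + site:depth + site:depth:transect,
## i.e. explicit fixed-effect nesting; dive_id (time) enters as a
## random intercept.  If dive_id is nested within the factor
## combination, (1 | site:depth:transect:dive_id) would express that.
fit <- glmer(counts ~ site/depth/transect + (1 | dive_id),
             data = dat, family = poisson)
```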
Hi Yolande,

It is not clear what REML is with GLMMs.  In LMMs, REML maximizes a
likelihood that depends only on the variance components, obtained by
integrating out the fixed effects.  AFAIK (Dr. Bates will hopefully
correct me if I am wrong or step in with a more thorough explanation)
it is unclear what the corresponding distribution depending only on
the variance components would be for nonlinear models.  The only
approaches I know of that still do something REML-ish use iterative
linear approximations (e.g., the GLIMMIX macro did this with repeated
underlying calls to PROC MIXED in SAS).

So, I do not think it is surprising that with a GLMM (which a
mixed-effects Poisson model is), specifying REML does nothing.
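
To see the distinction concretely, here is a small sketch contrasting REML and ML for an *L*MM, where REML is well defined, using the `sleepstudy` data shipped with lme4:

```r
## REML vs. ML fits of the same linear mixed model.
library(lme4)

fm_reml <- lmer(Reaction ~ Days + (Days | Subject), sleepstudy, REML = TRUE)
fm_ml   <- lmer(Reaction ~ Days + (Days | Subject), sleepstudy, REML = FALSE)

## The variance-component estimates differ between the two criteria;
## a GLMM fit with glmer() is always (approximate) ML, so there is no
## analogous switch to toggle.
VarCorr(fm_reml)
VarCorr(fm_ml)
```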

Cheers,

Josh
On Fri, Jul 20, 2012 at 9:39 AM, Yolande Tra <yolande.tra at gmail.com> wrote:

  
    
Joshua Wiley <jwiley.psych at ...> writes:
[snip ...]
I have added a (short!) discussion of this issue, with some links
to mailing list threads and literature, at:

http://glmm.wikidot.com/faq#reml-glmm
1 day later
Hello MCMCglmm experts !

I have a random intercept model for which I am using MCMCglmm with the following type of call

MC1 <- MCMCglmm(bly ~ x1 + x2, random = ~school, data = dt,
                family = "categorical",
                prior = list(R = list(V = 1, fix = 1),
                             G = list(G1 = list(V = 1, nu = 0))),
                slice = TRUE, nitt = iter, burnin = burn)

I was told that this prior specification, ...G=list(G1=list(V=1, nu=0)), is non-informative for the random-effect variance. Is this correct?

One problem I am having with my models: when I run a null model (with no covariates, only bly ~ 1, random=~school), the posterior mean for the random intercept (the G structure, ~school) is lower than when I add covariates, although the 95% credible intervals overlap. This happens with several batches of similar data. Is this a cause for concern, and could it be related to the prior specification?

Thank you!
JK
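
One common way to probe prior sensitivity for a variance component in MCMCglmm is to refit with a parameter-expanded prior and compare the posteriors. The sketch below follows the prior-list structure documented for MCMCglmm (the `alpha.mu`/`alpha.V` elements); the specific values are illustrative assumptions, not a recommendation:

```r
## Sketch: a parameter-expanded prior for the school variance, as an
## alternative to V = 1, nu = 0, for checking prior sensitivity.
## The residual variance stays fixed at 1, as the "categorical"
## family requires.
prior_px <- list(R = list(V = 1, fix = 1),
                 G = list(G1 = list(V = 1, nu = 1,
                                    alpha.mu = 0, alpha.V = 1000)))
```

If the posterior for the ~school variance changes substantially under this prior, that would suggest the original result is indeed prior-sensitive.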