
Back-transformation of Poisson model

Hi Mollie,

In simple models it is the sum of the variance components, although with 
things such as random regression it gets a bit more complicated. For 
log-link Poisson models the data-scale mean is simply the mean of a 
log-normal, exp(mu + sigma^2/2). For logit models there is no analytical 
solution, and the approximation in the CourseNotes I certainly got from 
somewhere but can't now remember where from: somebody else has pointed 
out that my McCulloch & Searle citation does not have the relevant 
material, so I will try and hunt down where the result comes from.
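As a quick numerical check of both points (a Python sketch; the values of 
mu and sigma are arbitrary): the log-link back-transformation is the 
log-normal mean exp(mu + sigma^2/2), and the usual logit approximation 
rescales the linear predictor by sqrt(1 + c^2*sigma^2) with 
c = 16*sqrt(3)/(15*pi).

```python
import math
import random

random.seed(1)
mu, sigma = 0.5, 0.8  # arbitrary link-scale mean and random-effect SD

# Log link: the data-scale mean is the log-normal mean exp(mu + sigma^2/2)
draws = [math.exp(random.gauss(mu, sigma)) for _ in range(200_000)]
mc_mean = sum(draws) / len(draws)
lognormal_mean = math.exp(mu + sigma**2 / 2)

# Logit link: no closed form; the approximation rescales the linear
# predictor by sqrt(1 + c^2 * sigma^2), c = 16*sqrt(3)/(15*pi)
def plogis(x):
    return 1 / (1 + math.exp(-x))

c2 = (16 * math.sqrt(3) / (15 * math.pi)) ** 2
p_mc = sum(plogis(random.gauss(mu, sigma)) for _ in range(200_000)) / 200_000
p_approx = plogis(mu / math.sqrt(1 + c2 * sigma**2))
```

The Monte Carlo averages agree with the log-normal mean and with the 
logistic approximation to a few decimal places here, though the logit 
result is of course only approximate.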

I think the deviation between your predicted and actual mean arises 
because of sampling error in the intercept/variance. By Jensen's 
inequality, the mean of an exponentiated sum of two normals will be 
larger than the exponential of the sum of their means. Consequently, 
unbiased estimators of the intercept and variance would lead to an 
upwardly biased estimator of the data-scale mean. Not sure about this 
though! Perhaps see if the predicted and actual means become closer if 
you up the number of random-effect levels (so that the intercept and 
variance are more precisely estimated).
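A small simulation of this point (Python; an idealised setting where the 
random-effect values are treated as directly observed, so the intercept 
and variance estimators are exactly unbiased): plugging unbiased 
link-scale estimates into the convex function exp(b + s^2/2) gives a 
data-scale mean that is biased upward, and the bias shrinks as the 
number of random-effect levels grows.

```python
import math
import random

random.seed(2)
mu, sigma2 = 0.0, 1.0                  # true link-scale intercept and RE variance
true_mean = math.exp(mu + sigma2 / 2)  # true data-scale mean

def plug_in_estimate(n_levels):
    # Idealised: random-effect values observed directly, so b_hat and
    # s2_hat are exactly unbiased for mu and sigma2
    u = [random.gauss(mu, math.sqrt(sigma2)) for _ in range(n_levels)]
    b_hat = sum(u) / n_levels
    s2_hat = sum((x - b_hat) ** 2 for x in u) / (n_levels - 1)
    # Plugging unbiased estimates into a convex function biases the result up
    return math.exp(b_hat + s2_hat / 2)

# Average plug-in estimate over many replicate datasets, per sample size
avg = {n: sum(plug_in_estimate(n) for _ in range(5_000)) / 5_000
       for n in (5, 50, 500)}
```

With 5 levels the average plug-in estimate is well above the true mean; 
with 500 levels it is close to it, consistent with the suggestion above.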

Regarding Q2, I'm also not sure. In predict.MCMCglmm you can specify the 
random effects you want to marginalise over, but I *think* this is not 
implemented in lme4?
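To illustrate why marginalising matters for a nonlinear link (a Python 
sketch with arbitrary logit-scale values, not MCMCglmm or lme4 code): a 
conditional prediction plugs in a random effect of zero, while a 
marginal prediction averages the inverse link over the random-effect 
distribution, and the two differ whenever the link is nonlinear.

```python
import math
import random

random.seed(3)
mu, sigma = 1.0, 1.5  # arbitrary logit-scale intercept and random-effect SD

def plogis(x):
    return 1 / (1 + math.exp(-x))

# Conditional ("subject-specific") prediction: random effect set to zero
p_conditional = plogis(mu)

# Marginal ("population-average") prediction: average the inverse link
# over the random-effect distribution
p_marginal = sum(plogis(random.gauss(mu, sigma))
                 for _ in range(200_000)) / 200_000
```

The marginal probability is pulled towards 0.5 relative to the 
conditional one, which is why predictions that condition on versus 
marginalise over the random effects can differ noticeably.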

Cheers,

Jarrod
On 14/03/2016 16:34, Mollie Brooks wrote: