
Random effects variances in R and SPSS not matching

Without more information, we don't know for sure that the models are the
same in both languages.

It's too much of a time sink for a human to change model details at
random until the output matches some expected output, though you could
probably automate that search with genetic programming or simulated
annealing....

But if you can get more information, I would start by making sure
- the contrasts are truly the same
- the assumed covariance structures are the same
- one program isn't dropping observations that the other is keeping
(check the reported number of observations and the number of levels of
the grouping variable)
- the estimation method is the same in both (ML vs. REML; hopefully
SPSS isn't using something like quasi-likelihood)
- different optimizers (if available) give the same result within each
program (i.e. make sure you're not at a local optimum)
- the result cross-checks against yet another software package
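
In lme4 terms, those checks can be sketched roughly as follows (here
`dat`, `y`, and `treat` are placeholder names standing in for the
poster's actual data and fixed effects, which we haven't seen):

    library(lme4)

    ## Placeholder model -- substitute the actual formula and data
    m_reml <- lmer(y ~ treat + (1 | Student), data = dat, REML = TRUE)

    ## Contrasts: confirm what R is actually using for each factor
    lapply(Filter(is.factor, dat), contrasts)

    ## Estimation method: refit with ML to compare against an ML fit in SPSS
    m_ml <- update(m_reml, REML = FALSE)

    ## Dropped observations: rows actually used, and levels of each
    ## grouping variable
    nobs(m_reml)
    ngrps(m_reml)

    ## Local optima: refit with every available optimizer and compare
    summary(allFit(m_reml))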

For example, cross-checking against lme4 immediately hints that this
model might not be advisable / might not have a well-defined optimum:

    REML = FALSE)
    Error: number of observations (=1600) <= number of random effects
    (=1600) for term (0 + name | Student); the random-effects parameters
    and the residual variance (or scale parameter) are probably
    unidentifiable
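
That error can be reproduced on made-up data, which shows where it comes
from: `(0 + name | Student)` asks for one random effect per `name` level
within each `Student`, so the number of random effects can grow to match
the number of rows, leaving the residual variance unidentifiable. A toy
sketch (fabricated dimensions, not the poster's data) hits the same wall:

    library(lme4)
    set.seed(1)
    ## 40 students x 4 names, one observation per combination:
    ## 4 random effects per student x 40 students = 160 effects for 160 rows
    toy <- expand.grid(Student = factor(1:40), name = factor(letters[1:4]))
    toy$y <- rnorm(nrow(toy))
    lmer(y ~ 1 + (0 + name | Student), data = toy)
    ## lme4 should stop with the same "number of observations <= number
    ## of random effects" error, unless the check is relaxed via lmerControl()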

Phillip
On 31/3/21 10:15 pm, Simon Harmel wrote: