negative variances
Dear Prof Bates,

Many thanks for your email. I tried lmer() and received the following message:

lm2 <- lmer(ppd ~ month + (month | id))

Warning message:
Estimated variance-covariance for factor 'id' is singular in:
`LMEoptimize<-`(`*tmp*`, value = list(maxIter = 200, tolerance = 1.49011611938477e-08,

I then tried lmer2():

lm3 <- lmer2(ppd ~ month + (month | id))
summary(lm3)
Linear mixed-effects model fit by REML
AIC BIC logLik MLdeviance REMLdeviance
1146 1166 -567.9 1126 1136
Random effects:
Groups Name Variance Std.Dev. Corr
id (Intercept) 0.1566631 0.395807
month 0.0022331 0.047255 1.000
Residual 0.6391120 0.799445
Number of obs: 420, groups: id, 140
Fixed effects:
Estimate Std. Error t value
(Intercept) 6.43595 0.07017 91.72
month -0.36619 0.01642 -22.30
Correlation of Fixed Effects:
(Intr)
month -0.544
However, I am not sure about these results, because MLwiN showed that both
random-effect variances (intercept and slope) were negative (-0.196 and -0.023).
I started to notice this problem of negative variances when I was learning
how to use structural equation modeling software to run multilevel models
for longitudinal data. To my great surprise, it occurs quite frequently. In
SEM, this problem can sometimes be overcome by estimating a nonlinear
model, freeing the factor loadings. For example, in these data, PPD
(probing pocket depth) was measured three times, at months 0, 3 and 6. I
fixed only the first and last factor loadings, to 0 and 6, to allow a
non-linear relation, and I also allowed the level-1 residuals to differ on
each occasion. However, for some data sets I failed to get a satisfactory
model no matter how I modified my models.
I looked for a discussion in several multilevel modeling textbooks but
found only one short treatment, in the book by Brown and Prescott. The SEM
literature usually suggests fixing the negative variances to 0. However, I
wonder whether this is the only way, or indeed a sensible way, to get
around this problem, because if the random effects are fixed to 0 the
model is no longer a random-effects model.
With best regards,
Yu-Kang
From: "Douglas Bates" <bates at stat.wisc.edu>
To: "Tu Yu-Kang" <yukangtu at hotmail.com>
CC: r-help at stat.math.ethz.ch, r-sig-mixed-models at r-project.org
Subject: Re: [R] negative variances
Date: Wed, 11 Apr 2007 09:15:21 -0500

On 4/11/07, Tu Yu-Kang <yukangtu at hotmail.com> wrote:
> Dear R experts,
>
> I have a question which may not be directly relevant to R, but I would be
> grateful for any advice. I ran a two-level multilevel model for data with
> repeated measurements over time, i.e. level 1 the repeated measures and
> level 2 the subjects. I could not get convergence using lme(), so I tried
> MLwiN, which eventually showed that the level-2 variances (the random
> effects for the intercept and slope) were negative. I know this is known
> as a Heywood case in the structural equation modeling literature, but the
> only discussion of this problem that I can find in the literature on
> multilevel and random-effects models is in the book by Prescott and
> Brown. Any suggestion on how to solve this problem would be highly
> appreciated.
It is possible for the ML or REML estimate of a variance component to be
zero. The algorithm used in lme doesn't perform well in this situation,
which is one reason the lmer and lmer2 functions in the lme4 package were
created. Could you try fitting the model with those, or provide us with
the data so we can check it out?

I recommend moving this discussion to the R-SIG-mixed-models mailing list,
which I am copying on this reply.
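[Editorial note: the sketch below illustrates the boundary-estimate issue the thread discusses. It assumes a data frame `dat` with columns `ppd`, `month`, and `id` as in the thread, and uses current lme4 syntax (the `||` uncorrelated-random-effects notation and `isSingular()` postdate this 2007 exchange); it has not been run against the original data.]

## Sketch only: handling a variance component estimated at the boundary,
## rather than fixing a "negative variance" to zero by hand.
library(lme4)

## Full model, as in the thread: correlated random intercept and slope.
m.full <- lmer(ppd ~ month + (month | id), data = dat)
isSingular(m.full)  # TRUE signals a boundary fit (e.g. correlation of +/-1)

## Simpler alternatives to compare against the full model:
## (a) uncorrelated random intercept and slope;
m.diag <- lmer(ppd ~ month + (month || id), data = dat)
## (b) random intercept only, dropping the slope variance entirely.
m.int  <- lmer(ppd ~ month + (1 | id), data = dat)

## Likelihood-ratio comparison of the nested fits (refit by ML).
anova(m.int, m.diag, m.full)

If the simpler model is not significantly worse, reporting it is usually preferable to forcing a variance to zero inside the richer parameterization, since the boundary estimate indicates the data cannot support the extra component.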