Lmer and variance-covariance matrix

10 messages · Antoine, Douglas Bates, Jarrod Hadfield +4 more

7 days later
#
On Thu, Mar 3, 2011 at 7:03 AM, Antoine Paccard
<antoine.paccard at unine.ch> wrote:
If there are 15 levels of trait you are trying to estimate 240
variance-covariance parameters (120 for fam1 and 120 for fam1:id1).
That is a very large optimization problem, so I'm not surprised that
there is difficulty in finding the optimum.
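An unstructured covariance matrix over K traits has K(K+1)/2 free
(co)variance parameters, which is where the 240 comes from; a quick
check in R:

```r
K <- 15
K * (K + 1) / 2        # one unstructured 15 x 15 covariance term: 120
2 * (K * (K + 1) / 2)  # two such terms (fam1 and fam1:id1): 240
```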
#
Hi,

In addition, each trait is only measured once for each id (correct?),
which means that the likelihood could not be optimised even if the
data set were massive. If you could fix the residual variance to some
value (preferably zero) then the problem has a unique solution given
enough data, but I'm not sure whether this can be done in lmer. Since
the structure of the residuals is at the moment quite inflexible, you
probably can't use lmer to fit multi-response models unless the
responses are non-Gaussian and non-binary.

Cheers,

Jarrod
On 11 Mar 2011, at 13:18, Douglas Bates wrote:
#
On 12/03/11 02:56, Jarrod Hadfield wrote:
<SNIP>

I think that it ***CANNOT*** be done.  I once asked Doug about
the possibility of this, and he ignored me.  As people so often
do. :-) Especially when I ask silly questions .....

     cheers,

         Rolf Turner
#
On Fri, Mar 11, 2011 at 2:37 PM, Rolf Turner <r.turner at auckland.ac.nz> wrote:
Did Doug really ignore you, or did he say that the methods in lmer are
based on determining the solution to a penalized linear least squares
problem, so they can't be applied to a model that has zero residual
variance?  Also, the basic parameterization for the variance-covariance
matrix of the random effects is in terms of the relative standard
deviation (\sigma_1/\sigma), which is problematic when \sigma is zero.

(My apologies if I did ignore you, Rolf.  I get a lot of email and
sometimes such requests slip down the stack and then get lost.  I'm
very good at procrastinating about the answers to such questions.)
#
On 12/03/11 09:45, Douglas Bates wrote:
Yes, you really did ignore me.  But not to worry; I'm used to it! :-)
I also (more recently) asked Ben Bolker about this issue.  He
ignored me too!  At that stage I kind of took the hint ......

Your explanation of why it can't be done makes perfect sense.

However I find this constraint sad, because I like to be able to
fit ``marginal case'' models that can also be fitted in a more
simple-minded manner, and to compare the results from the
simple-minded procedure with those from the sophisticated
procedure.  If they agree, this augments my confidence
that I am implementing the sophisticated procedure correctly.

An example of such, relating to the current discussion, is a
simple repeated measures model with K (repeated) observations
on each of N subjects, with the within-subject covariance matrix
being an arbitrary positive definite K x K matrix.

This could be treated as a mixed model (if it were possible to
constrain the residual variance to be 0).  It can also be treated
as a (simple-minded) multivariate model --- N iid observations
of K-dimensional vectors, the mean and covariance matrix of
these vectors to be estimated.

I would have liked to be able to compare lmer() results with the
(trivial) multivariate analysis estimates, to reassure myself that
I was understanding lmer() syntax correctly.
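The ``trivial'' multivariate estimates here are just the sample mean
vector and sample covariance matrix; a minimal R sketch on simulated
data (the names and dimensions are illustrative, not from the thread):

```r
set.seed(1)
N <- 50; K <- 4                      # N subjects, K repeated measures each
Y <- matrix(rnorm(N * K), nrow = N)  # one row per subject
mu.hat    <- colMeans(Y)             # estimated K-vector of means
Sigma.hat <- cov(Y)                  # estimated K x K covariance (divisor N - 1)
```

These are the quantities one would want to match against a hypothetical
lmer() fit with the residual variance constrained to zero.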

     cheers,

         Rolf
#
On 03/11/2011 05:11 PM, Rolf Turner wrote:
I want to distinguish two cases here.
  It would be reasonable (although I think not currently feasible) to
fix any variance parameter *other than the residual variance* to zero,
refitting the model with that constraint, to do a test of the effect of
that parameter (keeping in mind the various limitations of finite sample
sizes/unknown null distributions, testing on the boundary, blah blah
blah).  This is a special case of the machinery that has to be built in
order for profiling of the random effects to work, and at a pinch you
can get the answer (if you can get lme4a to run on your system) by
fitting the profile and looking at the value at the boundary.
  It is reasonable, but not within the framework of lme4, to set one
particular variance component -- the residual variance -- to zero, because
(as Doug points out) it has a special role.

  cheers
    Ben Bolker
1 day later
#
Dear Antoine:

This is interesting, but...

On Thu, Mar 3, 2011 at 7:03 AM, Antoine Paccard
<antoine.paccard at unine.ch> wrote:
Many people here think your model cannot be fit, but you say it can
be fit with SAS, so can we please see your SAS commands and the
output?  SAS may be throwing out some parameters from the model
automatically, but lmer does not.

Just guessing :)

PJ
1 day later
#
I sent this on Friday and forgot to copy to list:

Antoine,

If you reshape your data into the form Fam, Y1, Y2, Y3, ... , Y15, then

summary( manova(cbind(Y1,Y2, ... ) ~ Fam ))$SS

produces the between- and within-family sums of squares and products
matrices (15 x 15). Divide by the appropriate degrees of freedom to
convert these to mean square and product matrices, subtract within
from between, and divide by family size to get the estimated matrix
of between-family variances and covariances.

Is it that simple? Or have I forgotten something?
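The recipe above is the classic method-of-moments estimator for a
balanced one-way multivariate design. A minimal R sketch on simulated
data (the balanced design, family sizes, and all object names are
assumptions for illustration):

```r
set.seed(1)
nfam <- 20; fsize <- 4; K <- 3        # balanced: 'fsize' ids per family
Fam <- factor(rep(seq_len(nfam), each = fsize))
Y   <- matrix(rnorm(nfam * fsize * K), ncol = K,
              dimnames = list(NULL, paste0("Y", 1:K)))

SS <- summary(manova(Y ~ Fam))$SS     # $Fam (between) and $Residuals (within) SSP
B  <- SS$Fam / (nfam - 1)             # between-family mean squares and products
W  <- SS$Residuals / (nfam * (fsize - 1))  # within-family mean squares and products
Vfam <- (B - W) / fsize               # estimated between-family (co)variance matrix
```

With pure-noise data Vfam will hover around zero (and need not be
positive definite); with real data it estimates the between-family
covariance matrix, which is the quantity the lmer model was trying
to recover.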
Paul Johnson wrote: