
Generalized least squares program

8 messages · Steven McKinney, Douglas Bates, Gang Chen +1 more

#
I'm using the function gls in nlme to fit an extended linear model by
generalized least squares many times in a loop, with the same design
matrix X and the same residual covariance structure but different
values of the response variable. A lot of time is wasted
reformulating the same model structures and repeating the overhead
matrix calculations; that waste stretches the runtime from one day to
one month. Are there slots available in gls that would let me avoid
it? If not, is there a counterpart of gls for extended linear models
with generalized least squares in lme4 that would provide such slots?

Thanks,
Gang
#
Hi Gang, 

Have you tried the predict() function?  You should be able to hand
new data to your saved fitted-model object.  See ?predict.gls for an
example.
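For concreteness, a minimal sketch of that suggestion using the stock
Ovary data from nlme (the model here is only an illustration, not your
actual one):

```r
library(nlme)

# Fit once and keep the fitted object...
fm <- gls(follicles ~ sin(2*pi*Time) + cos(2*pi*Time), Ovary,
          correlation = corAR1(form = ~ 1 | Mare))

# ...then hand new data to predict() for population-level predictions.
newdat <- data.frame(Time = seq(0, 1, length.out = 5))
p <- predict(fm, newdata = newdat)
```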

HTH

Steve McKinney

-----Original Message-----
From: r-sig-mixed-models-bounces at r-project.org on behalf of Gang Chen
Sent: Wed 7/16/2008 8:21 AM
To: r-sig-mixed-models at r-project.org
Subject: [R-sig-ME] Generalized least squares program
 

_______________________________________________
R-sig-mixed-models at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-sig-mixed-models
#
Hi Gang,

Sorry, I should have said "update()" instead
of "predict()".

However, a timing test on a small example shows no considerable gain
from updating an existing object compared to fitting a new one.  This
may be different in your scenario.  If not, it becomes an exercise in
diving into the code base to figure out how to adapt update.gls() or
the related fitting functions so your extra fits can run without
recomputing everything.
> fm1 <- gls(jitterFollicles ~ sin(2*pi*Time) + cos(2*pi*Time), Ovary,
+            correlation = corAR1(form = ~ 1 | Mare))
> fm2 <- gls(jitterFollicles ~ sin(2*pi*Time) + cos(2*pi*Time), Ovary,
+            correlation = corAR1(form = ~ 1 | Mare))
> all.equal(coef(fm1), coef(fm2))
[1] TRUE
> system.time(
+             for(i in 1:100) {
+               fm2 <- update(fm1, model. = jitterFollicles ~ .)
+             }
+             )
   user  system elapsed 
 12.022   5.238  17.264 
> system.time(
+             for(i in 1:100) {
+               fm3 <- gls(jitterFollicles ~ sin(2*pi*Time) + cos(2*pi*Time), Ovary,
+                          correlation = corAR1(form = ~ 1 | Mare))
+             }
+             )
   user  system elapsed 
 11.898   5.227  17.126 
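One other shortcut that might be worth timing (my own suggestion, and
untested for your setting, although corAR1() does document a fixed
argument): estimate the correlation parameter once, then freeze it in
the later fits so the optimizer only has to re-estimate the
coefficients:

```r
library(nlme)

# First fit: estimate the AR(1) parameter along with everything else.
fm  <- gls(follicles ~ sin(2*pi*Time) + cos(2*pi*Time), Ovary,
           correlation = corAR1(form = ~ 1 | Mare))
phi <- as.numeric(coef(fm$modelStruct$corStruct, unconstrained = FALSE))

# Later fits: freeze the correlation parameter with fixed = TRUE,
# so only the regression coefficients are re-estimated.
fmf <- gls(follicles ~ sin(2*pi*Time) + cos(2*pi*Time), Ovary,
           correlation = corAR1(phi, form = ~ 1 | Mare, fixed = TRUE))
```

With the correlation parameter fixed at its previously estimated value,
the coefficient estimates should agree with the full fit.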
Steve McKinney


-----Original Message-----
From: Gang Chen [mailto:gangchen at mail.nih.gov]
Sent: Wed 7/16/2008 11:34 AM
To: Steven McKinney
Cc: r-sig-mixed-models at r-project.org
Subject: Re: [R-sig-ME] Generalized least squares program
 
Thanks for the suggestion!

It seems to me predict() is the opposite of what I'm looking for.
With an extended linear model

Y = X*b + e,   e ~ N(0, s^2 R),

where s^2 R is the covariance matrix of the residual term e, both X
and the structure of R remain the same across all the loops in my
case; Y differs from one iteration to the next, and so do the
estimates of b and s^2 R.
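If the correlation parameters could also be held fixed across
iterations, that setup can be done entirely by hand: factor R once,
whiten X once, and then each new Y costs only a triangular solve plus
a least-squares solve. A sketch of that idea (my own illustration with
a known AR(1) R, not what nlme does internally):

```r
set.seed(1)
n <- 200
X <- cbind(1, sin(2*pi*(1:n)/n), cos(2*pi*(1:n)/n))  # fixed design
R <- 0.5^abs(outer(1:n, 1:n, "-"))                   # AR(1) correlation, assumed known

U    <- chol(R)                            # R = t(U) %*% U, factored once
Xw   <- backsolve(U, X, transpose = TRUE)  # whitened design, computed once
qrXw <- qr(Xw)                             # QR of whitened design, computed once

# Per-response work: one triangular solve plus one least-squares solve.
beta_hat <- function(y) qr.coef(qrXw, backsolve(U, y, transpose = TRUE))

y <- drop(X %*% c(1, 2, 3)) + drop(crossprod(U, rnorm(n)))  # one simulated Y
b <- beta_hat(y)
```

Because whitening turns GLS into ordinary least squares, beta_hat()
reproduces the direct GLS solution (X' R^-1 X)^-1 X' R^-1 Y without
refactoring R for each response.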

Gang
On Jul 16, 2008, at 1:29 PM, Steven McKinney wrote:

#
No, update() doesn't make much difference in my case either. I've  
been reluctant to dive into the source code, but I may have to do so  
if no better choices turn up.

Thanks again,
Gang
On Jul 16, 2008, at 4:32 PM, Steven McKinney wrote:

#
In general, update() applied to a fitted model object modifies the
original call and then evaluates the modified call, hence there is no
gain in speed.  The refit function in the lme4 package replaces the
response and refits the model with the new response.  If there were
such a function for gls objects, that is what you would want to use.
However, there is no refit method for gls objects.
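For reference, this is what refit() looks like on a model class that
does support it; the sleepstudy model is just a stand-in example:

```r
library(lme4)

fm   <- lmer(Reaction ~ Days + (Days | Subject), sleepstudy)
ynew <- simulate(fm)[[1]]  # a fresh response vector of the same length
fm2  <- refit(fm, ynew)    # reuses the existing model structure; only Y changes
```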
On Wed, Jul 16, 2008 at 4:23 PM, Gang Chen <gangchen at mail.nih.gov> wrote:
#
Dr. Bates,

Thanks a lot for the clarification!

Is there any plan to implement a gls method in lme4 so that I could
take advantage of the refit function?

Also, does lmer in lme4 allow specifying a covariance structure for
the residuals? If so, is there a way to trick lmer into running gls
by somehow zeroing out the across-group random effect Z*b in the
model y = X*beta + Z*b + e?

Thanks,
Gang
On Jul 17, 2008, at 12:40 PM, Douglas Bates wrote: