lme, corARMA and large data sets

2 messages · Peter Wandeler, Dimitris Rizopoulos

#
I am currently trying to get an "lme" analysis running to correct for the
non-independence of residuals (using e.g. corAR1 or corARMA) on a large data
set (>10000 obs) with one independent variable (lgeodisE) and one dependent
variable (gendis). Previous attempts using SAS failed; in addition, we were
told by SAS that our data set was too large to be handled by this procedure
anyway (!!).

SAS script
proc mixed data=raw method=reml maxiter=1000;
model gendis=lgeodisE / solution;
repeated / subject=intercept type=arma(1,1);
run;

So I turned to R. Being a complete R newbie, I have not yet managed to fit
exactly the same model in R, even on a reduced data set.

R command line (using "dummy" as a dummy grouping variable):
model.ARMA <- lme(gendis ~ lgeodisE, correlation = corARMA(p = 1, q = 1), random = ~ 1 | dummy)

Furthermore, memory allocation problems occurred again on my 1 GB RAM desktop
during some trials with larger data sets.

Can anybody help?
Cheers,
Peter
#
You should include the 'form' argument in corARMA(), i.e.,

corARMA(form = ~ 1 | dummy, p = 1, q = 1)
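Putting this together with the original call, the full fit would look
something like the sketch below (assuming the nlme package is loaded and the
data frame is named raw, matching the SAS script; the data= argument and
variable names are taken from the original post):

```r
library(nlme)

## Random intercept per (single-level) dummy group, with an ARMA(1,1)
## correlation structure on the within-group residuals. The 'form'
## argument tells corARMA() which grouping to order the residuals by.
model.ARMA <- lme(gendis ~ lgeodisE,
                  random = ~ 1 | dummy,
                  correlation = corARMA(form = ~ 1 | dummy, p = 1, q = 1),
                  data = raw)
summary(model.ARMA)
```

With >10000 observations in a single group, the corARMA structure implies a
very large within-group correlation matrix, which is likely the source of the
memory problems in both SAS and R.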


I hope it helps.

Best,
Dimitris

----
Dimitris Rizopoulos
Ph.D. Student
Biostatistical Centre
School of Public Health
Catholic University of Leuven

Address: Kapucijnenvoer 35, Leuven, Belgium
Tel: +32/16/336899
Fax: +32/16/337015
Web: http://www.med.kuleuven.ac.be/biostat/
     http://www.student.kuleuven.ac.be/~m0390867/dimitris.htm


----- Original Message ----- 
From: "Peter Wandeler" <p_wandeler at gmx.ch>
To: <R-help at stat.math.ethz.ch>
Sent: Thursday, April 14, 2005 2:12 PM
Subject: [R] lme, corARMA and large data sets