
spatial correlation in lme and huge correlation matrix (memory limit)

2 messages · Sabrina Plante, Milan Bouchet-Valat

Hi,

  I'm trying to introduce a spatial exponential correlation
structure (with range = 200 and a nugget effect of 0.3) into an lme model of
this form: lme(ARBUS ~ YEAR, random = ~1 | IDSOUS).

The structure of the data is "IDSOUS" "XMIN" "YMAX"   "YEAR"   "ARBUS"  
with 2 years of data and 5600 points for each year.

I do:

corstr <- corExp(value = c(200, 0.3), form = ~ XMIN + YMAX | YEAR, nugget = TRUE)

with | YEAR so that correlations between IDSOUS (plots) are computed only
within the same year, since their positions did not change between the two years.
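Putting the pieces together, the correlation structure is passed to lme() through its correlation argument. A minimal sketch, assuming the data frame is called dat (adjust to your own object name):

```r
## Sketch only: `dat` is assumed to hold the columns
## IDSOUS, XMIN, YMAX, YEAR, ARBUS described above.
library(nlme)

## Exponential spatial correlation: initial range 200, initial
## nugget proportion 0.3 (nugget = TRUE asks nlme to estimate it).
corstr <- corExp(value = c(200, 0.3),
                 form = ~ XMIN + YMAX | YEAR,
                 nugget = TRUE)

## Random intercept per plot, spatial correlation within each year.
fit <- lme(ARBUS ~ YEAR,
           random = ~ 1 | IDSOUS,
           correlation = corstr,
           data = dat)
```

Note that with 5600 points per grouping level, initializing this structure is exactly the step that triggers the allocation error described below.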

Then, when I try to initialize the corExp structure, I get:
Error: cannot allocate vector of size 293.2 Mb

R cannot allocate enough memory to hold the 5600 x 5599 correlation matrix.
So I tried the bigmemory package to handle these data,
but nlme works with S3 classes while a big.matrix object is an S4
class, so lme fails with an incompatible-class error.

Then I tried the ff package, but it returns this message:
Error in UseMethod("Initialize") :
   no applicable method for 'Initialize' applied to an object of class
"c('ff_vector', 'ff')"

At this point, I'm out of ideas. Does anybody know how to deal with
spatial correlation in a mixed-effects model and a huge correlation
matrix at the same time?

Thanks,

Sabrina
On Wednesday, August 29, 2012 at 16:00 -0400, Sabrina Plante wrote:
293.2 MB is not that much. What kind of computer do you have, and how much
RAM? Please also provide the output of sessionInfo(). With a reasonably
recent machine, even 32-bit, you should be able to cope with this.

Also, make sure you do not perform data management tasks before you run
the models: save the exact dataset you need using save(), restart R,
load() it, and directly run the model. The reason is that if you perform
memory-hungry tasks, you're going to fragment your RAM and R might not
be able to allocate a large vector in contiguous blocks.
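The suggested workflow can be sketched as follows (the file name and the data frame name dat are illustrative, not from the original post):

```r
## Step 1: in the session where the data were prepared,
## save only the object the model needs.
save(dat, file = "arbus_data.RData")

## Step 2: restart R to get a fresh, unfragmented address space.

## Step 3: in the new session, load the data and fit immediately,
## before any other memory-hungry operation.
load("arbus_data.RData")   # restores `dat`
library(nlme)
fit <- lme(ARBUS ~ YEAR, random = ~ 1 | IDSOUS,
           correlation = corExp(value = c(200, 0.3),
                                form = ~ XMIN + YMAX | YEAR,
                                nugget = TRUE),
           data = dat)
```

The point of the restart is that R needs one contiguous block for the 293.2 MB vector; a long interactive session can leave the heap too fragmented to provide it even when total free RAM is sufficient.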

And be sure to follow the thread named "limit on vector size allocation"
which was just started today about the same issue.


My two cents