
Problems allocating memory

2 messages · cumuluss at web.de, Douglas Bates

Hi to everyone,
I have been trying to fit a GLMM with a binomial error structure. My model is somewhat complex: I have 8 continuous predictor variables (one of them entered as a nonlinear term) and 5 categorical predictor variables, with some three-way interactions among them. In addition, I have 3 random effects and one offset variable in the model. The number of observations is greater than 3 million.
I'm working with the latest version of R, 2.14.0, on a 64-bit Windows system with 8 GB of RAM.
Nothing I have tried (reducing model complexity, a different 64-bit PC with even more memory) leads to a fitted model; I always get the error: cannot allocate vector of size 2 GB.
Is there anything I can do? I would be very grateful for any advice.

Paul T.
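[A back-of-the-envelope sketch, not part of the original thread: with 3 million observations (as stated above) and factor expansions plus three-way interactions, even the dense fixed-effects model matrix alone can approach the reported 2 GB limit. The column count `p` below is an invented illustrative value, not taken from the post.]

```r
## Rough memory estimate for one dense double-precision model matrix:
## rows * cols * 8 bytes per double.
n <- 3e6   # observations, as stated in the post
p <- 100   # ASSUMED number of fixed-effect columns after expanding
           # factors and three-way interactions (hypothetical value)
bytes <- n * p * 8
cat(sprintf("%.1f GB for one dense copy of the model matrix\n",
            bytes / 2^30))
# prints: 2.2 GB for one dense copy of the model matrix
```

Any intermediate copies R makes during fitting multiply this figure, which is why 8 GB of RAM can be exhausted quickly.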
On Sun, Dec 18, 2011 at 3:17 PM, <cumuluss at web.de> wrote:
You probably have multiple copies of some large objects hanging
around.  Can you send us the output of

str(myData)

where 'myData' is the name of the model frame containing the data you
are using, along with the formula for the model you are trying to fit?