
problems with allocate memory

2 messages · cumuluss at web.de, Douglas Bates

Hi Douglas,

The variable 'd' has about 710 levels.

For your other request I tried to fit the suggested model, but it was not possible. I tried different approaches, first without any interactions and no nonlinear term. It fitted; the object size was about 731571664 bytes. Then I successively made the model more complex. With one two-way interaction, with one three-way interaction, or with the nonlinear term, the object size was roughly the same as before. With five two-way interactions, always with the nonlinear term, the object size went up to 1075643424 bytes. With one additional two-way interaction the model won't fit anymore and fails with the known error.
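The incremental size checks described above can be sketched like this (the simulated data, variable names, and model formulas here are illustrative stand-ins, not the actual dataset or models):

```r
library(lme4)

## Simulated data standing in for the real dataset (illustrative only).
set.seed(1)
dat <- data.frame(
  y  = rnorm(1000),
  x1 = rnorm(1000),
  x2 = rnorm(1000),
  g  = factor(sample(letters, 1000, replace = TRUE))
)

## Fit increasingly complex models and record each fitted object's
## size, mirroring the step-by-step approach in the message.
fits <- list(
  additive    = lmer(y ~ x1 + x2 + (1 | g), data = dat),
  interaction = lmer(y ~ x1 * x2 + (1 | g), data = dat)
)
sapply(fits, function(m) format(object.size(m), units = "Mb"))
```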

Perhaps another hint: yesterday I attempted to fit a much simpler model with lmer, just to see if it works: mfit <- lmer(gr.b ~ f.ag + f.se + o.se + diff + exp.r + kl + (1|f) + (1|o), data = c.data, family = binomial). It fitted, but I could not open mfit. Trying to see only the coefficients also did not work. I saved the image, and it is unusually huge, about 1.7 GB.

By the way: after reloading the image, some interesting things happened. An error occurred: "slot coefs is not an S4 object". It seems to me that it is not possible to save the model results in an R image. Is that right?

I hope this makes your guess clearer. And again, thank you very much for your help.
Regards
Paul


-------- Original Message --------
On Tue, Dec 20, 2011 at 5:25 PM, <cumuluss at web.de> wrote:
Which is an indication that the fixed-effects model matrix is getting
to be too large.  There are no simple solutions at present.  You may
find that some packages allow you to fit such large models by working
with horizontal chunks of the data and accumulating the result, but
extending those to GLMMs would be decidedly non-trivial.
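As an illustration of the chunk-wise idea for ordinary linear models, here is a minimal sketch using the biglm package (named here as one example of such a package; the original message does not specify which packages are meant):

```r
library(biglm)

## Illustrative chunk-wise fitting: the model object is updated with
## one horizontal slice of the data at a time, so the full dataset
## never needs to be in memory at once.
set.seed(1)
make_chunk <- function(n) {
  x <- rnorm(n)
  data.frame(x = x, y = 2 * x + rnorm(n))
}

fit <- biglm(y ~ x, data = make_chunk(1000))
fit <- update(fit, make_chunk(1000))   # accumulate a second chunk
coef(fit)
```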
The problem there is that the implicit print(mfit) (which is what I
imagine you mean when you say "could not open") ends up taking copies
of the whole object, which will eat up all your memory.  Development
versions of lme4 may eventually help with that.
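One way to inspect pieces of a large fit without triggering the implicit print is to pull out individual components with the extractor functions. A small sketch using lme4's built-in sleepstudy data (for a huge object like mfit the same accessors apply):

```r
library(lme4)

## Fit a small model on lme4's bundled sleepstudy data.
fit <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)

## Extract components directly instead of printing the whole object:
fixef(fit)                # fixed-effect coefficients only
VarCorr(fit)              # variance components
head(ranef(fit)$Subject)  # a few random effects, not the full object
```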
It should be possible to save and load such an object but there is
always the problem that when you have an object that is a sizable
fraction of the total available memory then you can get bitten if
something behind the scenes happens to take a copy at some point in
the calculation.  The original design in R for keeping track of when a
copy must be made is not the greatest and, as a result, R is somewhat
conservative when deciding whether or not to copy an object.  Getting
around that limitation would mean reimplementing R, more-or-less from
scratch and Andrew Runnalls is the only person I know who is willing
to embark on that.
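A related practical point, sketched here as an assumption rather than something the message prescribes: saving the fitted object on its own with saveRDS, instead of the whole workspace with save.image, keeps the file to just what is needed (though it does not by itself guard against class changes between package versions):

```r
## Hypothetical sketch: mfit stands for the large fitted model from
## the earlier message.  Save only that object, not the workspace.
saveRDS(mfit, file = "mfit.rds")

## In a fresh session (with the same lme4 version loaded):
mfit <- readRDS("mfit.rds")
fixef(mfit)
```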