
Memory problem in R-1.4.1

2 messages · vito muggeo, Brian Ripley

Hi all,
In a simulation context, I'm applying a function of mine, say "myfun", to a
list of glm objects, "list.glm":
[1] 1000
[1] "glm" "lm"
[1] 1000

Because length(list.glm) and the sample size are rather large, I've split
the list into 10 sub-lists, say list.glm1, list.glm2, ....
Now, of course, I'm using:
out1<-lapply(list.glm1, myfun)
out2<-lapply(list.glm2, myfun)
....
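For reference, the splitting step could be sketched like this (toy stand-ins
for list.glm and myfun, since the real objects aren't shown; the split()
call is one hypothetical way to build the sub-lists):

```r
# Toy stand-ins (hypothetical) for the objects in the message:
# 1000 small glm fits and a function applied to each of them.
list.glm <- lapply(1:1000, function(i)
    glm(rbinom(20, 1, 0.5) ~ 1, family = binomial))
myfun <- function(fit) coef(fit)

# Split the 1000 fits into 10 sub-lists of 100 each,
# then apply myfun to one sub-list at a time.
chunks <- split(list.glm, rep(1:10, each = 100))
out1 <- lapply(chunks[[1]], myfun)
out2 <- lapply(chunks[[2]], myfun)
```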
However, only the first call works; the second one gives:

Error: cannot allocate vector of size 3 Kb
In addition: Warning message:
Reached total allocation of 255Mb: see help(memory.size)

So I increase the memory limit:
NULL
Error: cannot allocate vector of size 31 Kb
In addition: Warning message:
Reached total allocation of 300Mb: see help(memory.size)

Again I increase the memory limit:
NULL
.....
So it seems I have to increase memory.size each time before applying my
function. This is surprising because I know that, on returning to the
prompt, the memory is fully available again. Since the lists are similar,
why isn't the same memory size sufficient for every list?

Is there any way to solve this problem, or do I have to increase
memory.size() after every call?
And if so, is there a limit?
Moreover, the problem does not depend on the number of simulated samples
(i.e. length(list)), because I'm applying the function to sub-lists having
just 100 components.


I'm running R 1.4.1 on WinMe, Pentium III 750, with 256Mb RAM.

By the way, would there be the same problem on Linux?

Thanks for your help,
best,
vito
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
On Tue, 30 Apr 2002, vito muggeo wrote:

            
> So being the lists similar, why the same memory size is not
> sufficient for every list?

Because the returned objects are still in memory.  My guess is that
out1 etc. are large objects: try object.size() to see.  I do wonder
whether a simple for() loop would not work better.

You could start R with --max-mem-size (and that's better than
increasing memory.limit), but swapping on WinMe is likely to be
painfully slow.

> And if it is so, is there a limit?

Somewhere around 1.5Gb, if you have enough swap space.

> By the way there would be the same problem on Linux?

Very likely, but Linux's swap space works much better than WinMe's, at
least up to 1Gb.
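The for() loop Brian suggests might look like this (a sketch with toy
stand-ins for list.glm and myfun; only the small per-fit results are kept,
so no large out1, out2, ... objects accumulate in the workspace):

```r
# Toy stand-ins (hypothetical) for the objects in the thread.
list.glm <- lapply(1:100, function(i)
    glm(rbinom(20, 1, 0.5) ~ 1, family = binomial))
myfun <- function(fit) coef(fit)

# Process the fits one at a time, keeping only the small results.
results <- vector("list", length(list.glm))
for (i in seq(along = list.glm)) {
    results[[i]] <- myfun(list.glm[[i]])
}

# Check how much memory the results themselves occupy.
object.size(results)
```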