Memory allocation

1 message · JFRI (Jesper Frickmann)

Go download R version 1.8.1 (or later, if one is available). A
memory-management fix between 1.8.0 and 1.8.1 got me out of exactly the
same problem. I believe the cause is memory fragmentation: after some
time, R cannot find a contiguous chunk large enough for even a small
vector.
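As a quick illustration (not part of the original exchange), the state of R's two memory pools and the point at which the next garbage collection will run can be inspected with gc(); the "gc trigger" columns are the levels that force a collection:

```r
# gc() reports memory used by cons cells (Ncells) and the vector heap
# (Vcells), in cells and in Mb, together with the current "gc trigger"
# levels; it also performs a garbage collection as a side effect.
usage <- gc()

# The result is a numeric matrix with one row per pool.
print(usage)
print(rownames(usage))   # "Ncells" "Vcells"
```

Fragmentation of the kind described above can leave allocations failing even when gc() shows plenty of headroom below the trigger.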

Kind regards, 
Jesper Frickmann 
Statistician, Quality Control 
Novozymes North America Inc. 
Tel. +1 919 494 3266
Fax +1 919 494 3460

-----Original Message-----
From: Richards, Thomas [mailto:RichardsTJ2 at UPMC.EDU] 
Sent: Monday, December 22, 2003 3:11 PM
To: 'r-help at stat.math.ethz.ch'
Subject: [R] Memory allocation


Hello:

	I am trying to work with a couple of microarray data sets, using:

platform i386-pc-mingw32
arch     i386           
os       mingw32        
system   i386, mingw32  
status                  
major    1              
minor    8.1            
year     2003           
month    11             
day      21             
language R              


In the shortcut for invoking R I have set --max-mem-size=1024M, so that
I get
[1] 1073741824
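The value shown is consistent with a 1 GB limit expressed in bytes; assuming it came from a call such as memory.limit() (the actual command was not preserved in the archive), the arithmetic is:

```r
# --max-mem-size=1024M corresponds to 1024 Mb, i.e. this many bytes:
limit.bytes <- 1024 * 1024^2
print(limit.bytes)   # 1073741824
```

Note that memory.limit() applies only to Windows builds of R, which matches the i386-pc-mingw32 platform reported above.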

Below is an example of what keeps happening as I am working. Any
suggestions as to how I can stop running out of memory?
[1] 502904736
harvard.genes$probeset,]
Error: cannot allocate vector of size 49 Kb
Error: cannot allocate vector of size 49 Kb
+   function(L) L[1]))[-1]
Error: cannot allocate vector of size 49 Kb
[1] 502912536
used  (Mb) gc trigger  (Mb)
Ncells  2586025  69.1    6812252 182.0
Vcells 20108076 153.5   41205530 314.4
[1] 330645720
[1] 3.247408
harvard.genes$probeset,]
Error: cannot allocate vector of size 49 Kb
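One common workaround in a session like this (a sketch only; the object name below is hypothetical, not from the original transcript) is to remove large intermediates and force a full garbage collection before attempting the next big subset:

```r
# Hypothetical cleanup: big.intermediate stands in for any large object
# that is no longer needed.
big.intermediate <- matrix(0, nrow = 1000, ncol = 1000)  # ~8 Mb of doubles

# Drop the object from the workspace ...
rm(big.intermediate)

# ... then run gc() so the freed vector cells are actually returned to
# the heap before the next large allocation is attempted.
gc()
```

As the reply above notes, if the heap has become fragmented this may not be enough, and upgrading past 1.8.0 (or restarting the session and reloading only what is needed) may be the practical fix.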


______________________________________________
R-help at stat.math.ethz.ch mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help