
memory issues with large data set

3 messages · Christina Yau, James W. MacDonald, roger bos

#
Hi,
 
I am running R 2.0.1.1 on Windows.  It is a Dell Dimension with a 3.2 GHz processor and 4 GB of RAM.
 
When using the ReadAffy() function to read in 97 arrays, I get the error messages below:
Error: cannot allocate vector of size 393529
Reached total allocation of 1024Mb: see help(memory.size)
 
When I use the command "memory.limit(size=4000)" to increase the memory limit to the maximum available, I get "NULL" as the response.
 
I proceeded to re-run ReadAffy().  This time I get only the first error message:
Error: cannot allocate vector of size 393529
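For reference, the commands involved can be run together as below (a sketch only; memory.limit() and memory.size() are Windows-only, and in R versions of this era memory.limit(size=) could return NULL invisibly even when the request succeeded, so NULL by itself does not mean it failed):

```r
## Sketch (Windows-only): inspect and raise R's memory limit.
memory.limit()             # current limit, in Mb
memory.limit(size = 4000)  # request a larger limit (capped by the OS)
memory.size()              # memory currently in use, in Mb
```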
Thank you for your attention,
Christina
#
Christina Yau wrote:
This question concerns a BioC package, so the correct listserv is 
bioconductor at stat.math.ethz.ch, not the R-help listserv. In the future, 
you should direct questions about BioC packages there.

You don't have enough memory to read all 97 arrays into an AffyBatch, 
not to mention doing any further processing on them. You will have to 
use justRMA() or justGCRMA() to process your data.
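A minimal sketch of that approach, assuming the CEL files sit in the working directory (the path argument is illustrative): justRMA() reads the files and computes normalized expression values directly, without ever holding a full 97-array AffyBatch in memory.

```r
library(affy)    # provides justRMA(); the gcrma package provides justGCRMA()

## Read and normalize in one step; by default justRMA() picks up
## all *.CEL files in the current working directory.
eset <- justRMA()
## or point it at a directory explicitly:
## eset <- justRMA(celfile.path = "path/to/celfiles")

## The result is an expression set; exprs() gives the matrix of
## normalized expression values (probesets x arrays).
exprs(eset)[1:5, 1:3]
```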

In addition, I don't think you can access more than 2 Gb of RAM 
anyway without making some changes. See entry 2.11 of the R for Windows FAQ.
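For context, on 32-bit Windows of this era the change in question typically meant booting with the /3GB switch so that a large-address-aware executable could use up to 3 Gb of address space. A hedged sketch of what the boot.ini entry looked like (the disk/partition path is illustrative and varies by machine):

```
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP" /fastdetect /3GB
```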

HTH,

Jim