
cannot allocate vector of size 71773 Kb (PR#915)

3 messages · cdhershberger@dow.com, Uwe Ligges, Paul Gilbert

#
Full_Name: Doug Hershberger
Version: 1.2.2
OS: Red Hat-Linux 7.0
Submission from: (NULL) (216.99.65.36)


In the R FAQ I find the following entry explaining that R no longer has   
problems with memory.

http://cran.r-project.org/doc/FAQ/R-FAQ.html#Why%20does%20R%20run%20out%20of%20of%20memory%3f

However in my installation: R Version 1.2.2  (2001-02-26)

Installed from the red hat RPM on your site on a Red Hat 7.0 i686

I get the following error when working with large data sets:

                > source("/usr/local/genex/rcluster/lib/rcluster/r/hcluster.r");
                > breadth.program("uploaded_data.txt", "average", 10)
                Read 2 items
                Read 8574 items
                Error: cannot allocate vector of size 71773 Kb
                Execution halted

Is there a way to fix this? This is running as part of the GeneX
installation so I would have to dig through to figure out how to give R
more memory.

Besides which I don't think that will work.

Thanks for any insight that you can provide.

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-devel mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-devel-request@stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
#
cdhershberger@dow.com wrote:
That is NOT A BUG!

Have a look at ?Memory
Start R with --max-mem-size=xxxM
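Applied at the shell, Uwe's suggestion looks roughly like this. The 256M figure is only an example, not a recommended value; pick something your machine can back with RAM plus swap. Note that on Unix builds of R of this era the heap was controlled by separate switches (see ?Memory), so the exact flag name may differ on your platform. The guard simply skips the call if R is not on the PATH.

```shell
# Hypothetical size (256M) -- adjust to your machine.  See ?Memory in
# R for the switches your build actually understands.
if command -v R >/dev/null 2>&1; then
    R --max-mem-size=256M --no-save < /dev/null
fi
```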

Uwe Ligges
#
In R 1.2.2 this message means the operating system is not letting R have the memory.
You do not need parameters on the R command line (and it seems better not to have
them) but you do need to do some things in the operating system before you start R.
In Unix/Linux you first need to check that your datasize and stacksize limits are not
set, as they usually are. Use limit or unlimit depending on your shell or OS. You then
need to have adequate swap space. Physical memory will make things faster, but is not
necessary. You may need a very large swap space if you are going to do much with an
80M vector, but perhaps it gets broken into smaller pieces once you get it loaded.
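In a Bourne-style shell the limit check Paul describes looks roughly like this (csh/tcsh users would use `limit` and `unlimit` instead; whether `unlimited` is accepted depends on the hard limits set by the administrator):

```shell
# Show the current per-process limits (values in KB, or "unlimited").
ulimit -d     # datasize
ulimit -s     # stacksize

# Lift them for this shell session, so an R started from here inherits
# the relaxed limits.  This can fail if a hard limit is in place.
ulimit -d unlimited 2>/dev/null || echo "could not raise datasize limit"
ulimit -s unlimited 2>/dev/null || echo "could not raise stacksize limit"
```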

Given current memory prices you should probably consider more physical memory, but I
expect you will need several gigs of virtual memory to do much work with an 80M
vector. Also beware: as I recall, mkswap on Linux defaults to an older swap format, and
the newer format is faster.
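For reference, here is how to inspect the swap currently active, together with a sketch of adding more (Linux-specific; the path /swapfile and the 1 GB size are arbitrary examples, and the creation steps need root, so they are shown commented out):

```shell
# Show how much swap is configured and in use (Linux).
grep -i '^Swap' /proc/meminfo
cat /proc/swaps

# Sketch (root required): add a 1 GB swap file.  On old systems check
# `man mkswap` for the format version flag -- per Paul's recollection,
# the newer swap format is faster than the original one.
# dd if=/dev/zero of=/swapfile bs=1M count=1024
# chmod 600 /swapfile
# mkswap /swapfile
# swapon /swapfile
```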

Paul Gilbert


