tree in tree package - Error: cannot allocate vector of size 2.0 Gb

3 messages · Stephen Sefick, Steve Lianoglou, David Winsemius

#
R version 3.0.0 (2013-04-03)
Platform: x86_64-redhat-linux-gnu (64-bit)

locale:
  [1] LC_CTYPE=en_US.utf8       LC_NUMERIC=C
  [3] LC_TIME=en_US.utf8        LC_COLLATE=en_US.utf8
  [5] LC_MONETARY=en_US.utf8    LC_MESSAGES=en_US.utf8
  [7] LC_PAPER=C                LC_NAME=C
  [9] LC_ADDRESS=C              LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.utf8 LC_IDENTIFICATION=C

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base


6 GB RAM, Intel Core 2 Quad, Scientific Linux 6.4

I am using tree() from the tree package, and I get the following error:

Error: cannot allocate vector of size 2.0 Gb

Shouldn't I be able to allocate more than 2 GB of memory?  I am sure that I
am missing something.  Any help would be greatly appreciated.
kind regards,
#
Hi,
On Fri, May 31, 2013 at 5:38 PM, Stephen Sefick <sas0025 at auburn.edu> wrote:
I don't remember who on the R list previously put it this way, but
they put it best, I think:

The 2gb is the straw that broke the camel's back.

R has likely been allocating more and more memory for whatever it's
trying to do, and at some point it asked the OS for another 2 Gb,
and *bam* ... toasted.

If you look at top (or htop) and monitor the R process as it's
running, I reckon that's what you'll see, too.
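You can also watch this from inside R itself. As a minimal sketch (mine, not from the thread): gc() reports how much memory R is currently holding, and object.size() reports one object's footprint, so you can see the footprint grow as a large object is built.

```r
## Watch R's memory footprint grow while a large object is created.
## gc() returns a matrix; column 2 is megabytes in use (Ncells + Vcells).
before <- sum(gc()[, 2])
x <- numeric(1e7)                    # 1e7 doubles at 8 bytes each, ~76 Mb
after  <- sum(gc()[, 2])
cat("delta (Mb):", round(after - before), "\n")
print(object.size(x), units = "Mb")  # about 76.3 Mb
```

The same idea scales up: by the time R asks for one more 2 Gb chunk, gc() will already show a large footprint.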

HTH,

-steve

--
Steve Lianoglou
Computational Biologist
Bioinformatics and Computational Biology
Genentech
#
On May 31, 2013, at 2:38 PM, Stephen Sefick wrote:

I believe that should be read as "unable to allocate contiguous block of memory for a new object of size 2Gb".

The general rule is that you should have at least 3 times as much free RAM (free of the OS and other running applications) as the size of your largest object. You seem to be in violation of that rule.
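As a back-of-envelope sketch (my arithmetic, not from the thread): a 2.0 Gb block of doubles is roughly 268 million elements at 8 bytes each, and the 3x rule of thumb would call for about 6 Gb of free RAM, which is more than this 6 GB machine has left after the OS and other processes take their share.

```r
## A "2.0 Gb" allocation of doubles, and the 3x rule of thumb.
bytes_per_double <- 8
n_elements <- 2 * 2^30 / bytes_per_double   # ~268.4 million doubles
free_ram_needed_gb <- 3 * 2                 # 3x the size of the largest object
cat("elements in a 2 Gb double vector:", n_elements, "\n")
cat("suggested free RAM (Gb):", free_ram_needed_gb, "\n")
```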