
"cannot allocate vector of size ..." in RHEL5 PAE kernel

5 messages · Martin Maechler, Mauricio Zambrano-Bigiarini, Hugo Mildenberger

#
Dear R community,

I'm running 32-bit R on a 64-bit machine (with 16 GB of RAM) using a
PAE kernel, as you can see here:

$ uname -a
Linux mymachine 2.6.18-238.el5PAE #1 SMP Sun Dec 19 14:42:44 EST 2010
i686 i686 i386 GNU/Linux


When I try to create a large matrix ( Q.obs <- matrix(NA, nrow=6940,
ncol=9000) ), I get the following error:

Error: cannot allocate vector of size 238.3 Mb

However, the amount of free memory on my machine seems to be much
larger than this:

system("free")
              total       used       free     shared    buffers     cached
Mem:      12466236    6354116    6112120          0      67596    2107556
-/+ buffers/cache:    4178964    8287272
Swap:     12582904          0   12582904


I tried to increase the memory limit available for R by using:

$ R --min-vsize=10M --max-vsize=5000M --min-nsize=500k --max-nsize=5000M


but it didn't work.


Any hint about how I can get R to use all the memory available on the machine?


Thanks in advance,

Mauricio
#
    MZ> When I try to create a large matrix ( Q.obs <- matrix(NA, nrow=6940,
    MZ> ncol=9000) ), I got the following error:

    MZ> Error: cannot allocate vector of size 238.3 Mb

    MZ> Any hint about how can I get R using all the memory available in the machine ?

Install a 64-bit version of Linux, i.e., 64-bit Ubuntu in your case,
and work from there.
I don't think there's a way around that.

Martin
#
Thanks for your answer Martin, but unfortunately the decision to
install a 32-bit OS on the 64-bit machine was taken by the IT
guys at my workplace, not by me.

By the way, this problem didn't happen on Ubuntu but on Red Hat
Enterprise 5, and due to strong limitations on software installation
at my workplace I can't change that. At home I have 32-bit Ubuntu
10.10, but I cannot run the code I need on that machine.


Cheers,

Mauricio
#
Following the advice of a colleague, I put the gc() and gcinfo(TRUE)
commands just before the line where I got the error, and their output
was:

                 used (Mb) gc trigger  (Mb)  max used   (Mb)
Ncells  471485 12.6    1704095  45.6   7920371  211.5
Vcells 6408885 48.9  113919753 869.2 347651599 2652.4

Garbage collection 538 = 323+101+114 (level 2) ...
13.0 Mbytes of cons cells used (29%)
49.0 Mbytes of vectors used (7%)

Error: cannot allocate vector of size 238.1 Mb


If I understood correctly, I should have enough memory for allocating
the new matrix ( Q.obs <- matrix(NA, nrow=6940, ncol=9000) ).

Thanks in advance for any help,

Mauricio
#
Mauricio,

I tried your matrix allocation on Gentoo-hardened 32- and
64-bit systems. Both work OK, using R-2.11.1 and R-2.12.2 respectively,
and both use a recent 2.6.36 kernel revision.

This is from the 32-bit system with 512 MB physical memory.

free, before the allocation:
             total       used       free     shared    buffers     cached
Mem:        469356      61884     407472          0       1368      21592
-/+ buffers/cache:      38924     430432
Swap:      1927796      36096    1891700

gc(), before and after the allocation:
              used  (Mb) gc trigger  (Mb)  max used  (Mb)
Ncells      120116   3.3     350000   9.4    350000   9.4
Vcells       78413   0.6     786432   6.0    391299   3.0

              used  (Mb) gc trigger  (Mb)  max used  (Mb)
Ncells      120123   3.3     350000   9.4    350000   9.4
Vcells    31308414 238.9   34854943 266.0  31308428 238.9

free, after the allocation:
             total       used       free     shared    buffers     cached
Mem:        469356     307528     161828          0       1404      22508
-/+ buffers/cache:     283616     185740
Swap:      1927796      36084    1891712


MZ> I tried to increase the memory limit available for R by using:
MZ> $ R --min-vsize=10M --max-vsize=5000M --min-nsize=500k --max-nsize=5000M

Hmm, I wonder if specifying 5000M is a good idea in a 32-bit environment.
Depending on R's internal implementation, that value could overflow and
tacitly wrap around in a 32-bit integer (5000M bytes > 2^32 - 1). You may try
specifying 1000M instead. But I think it's more likely that the system or VM
configuration has set up a memory-usage limit per user or per process. How to
view/change this on Red Hat I don't know. But you may try to compile a small
C program using malloc() and see what happens if you request, say, 1 gigabyte:

#include <stdlib.h>
#include <stdio.h>

int main(void) {
     const size_t size = 1000000000LU;   /* ~1 GB */
     void *p = malloc(size);
     if (p) {
          fprintf(stderr, "successfully allocated %lu bytes\n",
                  (unsigned long)size);
          free(p);
          return 0;
     } else {
          /* %m is a glibc extension that prints strerror(errno) */
          fprintf(stderr, "allocation of %lu bytes failed: %m\n",
                  (unsigned long)size);
          return 1;
     }
}

Put this into a file named, say, "tmalloc.c", compile it, and run it:

      gcc tmalloc.c -o tmalloc
      ./tmalloc

Hugo
On Monday 17 January 2011 16:42:43 Mauricio Zambrano wrote: