On Mon, 15 May 2006, m.vroonhoven at erasmusmc.nl wrote:
Dear R developers,
We have a big SGI Origin compute server with 32 CPUs and 64 GB of
RAM. Under R 2.0.0 we could run large jobs; allocating 8 GB of RAM was
not a problem, for example by running:
v1 <- seq(1,2^29)
v2 <- seq(1,2^29)
v3 <- seq(1,2^29)
v4 <- seq(1,2^29)
This yields an R process consuming about 8 GB of RAM:
PID PGRP USERNAME PRI SIZE RES STATE TIME WCPU% CPU% COMMAND
177484 177484 mirjam 20 8225M 8217M sleep 1:18 29.3 0.00 R
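As a sanity check on those numbers (a rough sketch, not part of the
original report): each seq(1, 2^29) call produces a vector of 2^29
elements, and the error sizes later in this report imply R is storing
them as 4-byte integers, so the sizes work out:

```r
# Each seq(1, 2^29) call yields a vector of 2^29 elements, stored
# as 32-bit (4-byte) integers since 2^29 fits in R's integer range.
n <- 2^29
size_gb <- n * 4 / 2^30   # size of one vector, in GB
size_gb                   # 2 GB per vector
4 * size_gb               # four vectors: ~8 GB, matching the 8225M above
```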
After upgrading from R 2.0.0 to R 2.2.1, we can no longer allocate
more than about 1300 MB of memory, as shown below:
> v1 <- seq(1,2^29)
Error: cannot allocate vector of size 2097152 Kb
> v1 <- seq(1,2^28)
> v2 <- seq(1,2^27)
Error: cannot allocate vector of size 524288 Kb
> v2 <- seq(1,2^25)
> v3 <- seq(1,2^24)
> v4 <- seq(1,2^23)
> v5 <- seq(1,2^22)
Error: cannot allocate vector of size 16384 Kb
> v5 <- seq(1,2^21)
> v6 <- seq(1,2^20)
> v7 <- seq(1,2^19)
> v8 <- seq(1,2^18)
> q()
Save workspace image? [y/n/c]: n
Upgrading to R 2.3.0 yields the same results: the R process tops out
at about 1284 MB of RAM and refuses to allocate more, even though
about 30 GB is free on the machine.
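For what it's worth, the sizes of the seq() calls that did succeed in
the transcript above sum to just under the reported process size (a
back-of-the-envelope check, again assuming 4-byte integer vectors):

```r
# Exponents of the seq(1, 2^k) calls that allocated successfully
ok <- c(28, 25, 24, 23, 21, 20, 19, 18)
sizes_mb <- 4 * 2^ok / 2^20   # 4 bytes per element, converted to MB
sum(sizes_mb)                 # 1263 MB, consistent with the reported 1284M
```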
You can tell whether you have a 64-bit build of R by looking at
.Machine$sizeof.pointer in R, which should be 8.
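For example (a quick interactive check; the value depends on how R was
built, not on the hardware):

```r
# 8 on a 64-bit build of R, 4 on a 32-bit build.  A 32-bit build would
# also explain the allocation failures near 1-2 GB reported above.
.Machine$sizeof.pointer
```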