On Mon, 16 Jul 2001, Laurent Gautier wrote:
Dear R-users, I am currently facing what appears to be a strange thing (at least to my humble understanding). If I understood correctly, starting with version 1.2.3, R memory allocation is done dynamically, and there is no need to fiddle with the --nsize and --vsize parameters any longer. So far everything seemed to work that way (I saw the size of my processes growing when I was using big objects, and so on). However, I recently ran into memory trouble. There seems to be a limit of about 1.2 GB, beyond which R starts to emit memory allocation errors that are not consistent with the memory still available (e.g. 'Error: cannot allocate vector of size 125382 Kb', while there is still about 17 GB free).
There is an upper limit on the memory size because some internal objects in the memory manager are stored as ints (even on a 64-bit system this limits you, I think, to 4 GB, but it may be 2 GB).
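A quick back-of-the-envelope check (my own illustration of the arithmetic, not taken from R's source) shows why an int-sized byte counter produces caps of exactly that order:

    2^31 / 1024^3    # signed 32-bit maximum, in GB: 2
    2^32 / 1024^3    # unsigned 32-bit maximum, in GB: 4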
It shouldn't be too hard to expand these limits for 64-bit systems. There is a much firmer limit in the design of R to no more than 2^31 objects, and to objects of maximum length 2^31, but this would still allow multigigabyte workspaces.
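For a sense of scale (again my own arithmetic, not Thomas's): even a single double vector at the maximum length of 2^31 - 1 elements would occupy roughly 16 GB, so the per-object length limit by itself still leaves room for multigigabyte workspaces:

    (2^31 - 1) * 8 / 1024^3    # length cap times 8 bytes per double: ~16 GB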
-thomas
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject!) To: r-help-request@stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._