Dear R-users,

I am currently facing what appears to be a strange problem (at least to my humble understanding). If I understood correctly, starting with version 1.2.3, R can allocate memory dynamically, and there is no need to fiddle with the --nsize and --vsize parameters any longer. So far everything seemed to work that way (I saw the size of my processes grow when I was using big objects, and so on). However, I recently ran into trouble with memory: there seems to be a limit of about 1.2 GB, beyond which R starts to report memory allocation errors that are not consistent with the memory actually available (e.g. 'Error: cannot allocate vector of size 125382 Kb', while about 17 GB are still free). I thought default limits had been set, but that does not seem to be the case:
> mem.limits()
nsize vsize 
   NA    NA 
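For what it is worth, the failure can be reproduced with something along these lines (a minimal sketch; the vector length is simply the 125382 Kb from the error message converted to doubles at 8 bytes each):

> gc()                              # current Ncells/Vcells usage, well below any limit
> x <- numeric(125382 * 1024 / 8)   # try to allocate ~125382 Kb of doubles
Error: cannot allocate vector of size 125382 Kb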
Any idea? Where am I going wrong?

Laurent

PS: I am currently using R-1.3.0-patched, compiled on SGI IRIX 6.5 (I was using 1.2.3 and had the same kind of problems, which is why I upgraded).

--
Laurent Gautier                 CBS, Building 208, DTU
PhD. Student                    D-2800 Lyngby, Denmark
tel: +45 45 25 24 85            http://www.cbs.dtu.dk/laurent