
Trouble with the memory allocation

3 messages · Brian Ripley, Laurent Gautier

Dear R-users,


I am currently facing what appears to be a strange problem (at least
to my humble understanding).

If I understood correctly, starting with version 1.2.3, R memory
allocation is done dynamically, and there is no need to fiddle with
the --nsize and --vsize parameters any longer...

So far everything seemed to go that way (I saw the size of my
processes growing when I was using big objects, and so on). However,
recently I had trouble with the memory: there seems to be a limit of
about 1.2 GB, beyond which R starts to send memory allocation error
messages that are not consistent with the memory still available
(like 'Error: cannot allocate vector of size 125382 Kb', while there
are still about 17 GB free).

I thought default limits were set, but that does not seem to be the
case:

  > mem.limits()
  nsize vsize
     NA    NA


Any idea?

Where am I wrong?



Laurent


PS: I am currently using R-1.3.0-patched, compiled on SGI IRIX 6.5.
(I was using 1.2.3 and had the same kind of problems; that's why I
upgraded.)



--
Laurent Gautier                 CBS, Building 208, DTU
PhD. Student                    D-2800 Lyngby,Denmark
tel: +45 45 25 24 85            http://www.cbs.dtu.dk/laurent


On Mon, 16 Jul 2001, Laurent Gautier wrote:

            
> If I understood correctly, starting with version 1.2.3, R memory
> allocation is done dynamically

Starting with 1.2.0, yes.

> there seems to be a limit of about 1.2 GB, beyond which R starts to
> send memory allocation error messages

Is this compiled as a 32-bit or 64-bit process?  And are there any
per-process limits set?  A 32-bit process will (by definition) have a
4Gb limit, and may well have 2Gb or less depending on how the malloc
is organised.

> I thought default limits were set, but that does not seem to be the
> case

That's correct.  See ?mem.limits.

No change in that area since 1.2.3.

  
    
Prof Brian D Ripley wrote:

            
This was compiled with the GNU compiler suite (gcc, g77, f77 and
gmake). If I remember correctly, gcc only compiles 32-bit binaries
(but I may be completely wrong).

Mmmmh... I just tried a quick recursive perl hack and it seems the
memory limit for a process is around 1.7 GB...

I also talked to one of the system administrators here, and he is not
aware of any specific malloc organisation, nor of a per-process limit
(and the man page for malloc does not tell me more)...





Laurent





--
Laurent Gautier                 CBS, Building 208, DTU
PhD. Student                    D-2800 Lyngby,Denmark
tel: +45 45 25 24 85            http://www.cbs.dtu.dk/laurent

