R on Mac building up memory usage

3 messages · Didier Leibovici, Roger Bivand, Bede-Fazekas Ákos

Hi,

I guess this may not be specific to r-sig-geo, but since I am using
library(rgdal)
library(rgeos)

in this script, they may be the reason? (Perhaps I should try running
something else to check.)


So basically I am running some code that reads 61288 features, among
other things. If I run it once, gc() reports:
 > gc()
           used (Mb) gc trigger  (Mb) max used  (Mb)
Ncells 1833926 98.0    5103933 272.6  9968622 532.4
Vcells 2437534 18.6    7056348  53.9 11036325  84.3

and the monitor says R is using 3.3 GB.

Then I remove everything with rm(list=ls()) and run it again, trying
different sets of parameters, for example.
The second run gives a similar gc(), but R is now using 6.4 GB:
 > gc()
           used (Mb) gc trigger  (Mb) max used  (Mb)
Ncells 1834325 98.0    6323353 337.8  9968622 532.4
Vcells 2439267 18.7    7572947  57.8 11832730  90.3


After a while and a few other computations, R is using 10 GB,
and gc() gives:
 > gc()
           used (Mb) gc trigger  (Mb)  max used  (Mb)
Ncells 1863272 99.6    5944937 317.5   9968622 532.4
Vcells 2503503 19.2    8462995  64.6 100858653 769.5
 > rm(list=ls())
 > ls()
character(0)
 > gc()
          used (Mb) gc trigger  (Mb)  max used  (Mb)
Ncells 608451 32.5    4755949 254.0   9968622 532.4
Vcells 760517  5.9    6770396  51.7 100858653 769.5

but the monitor still shows 10 GB for R.
I have seen this build up to 50 GB, at which point my system tells me
to close some apps, all while doing the same thing: running one set,
then rm(list=ls()) ... So at the moment do I just have to close and
restart R?
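[A minimal sketch of the pattern described above, with a hypothetical allocation size: R's own accounting (gc()) drops after rm(), while the OS monitor may keep reporting the old figure.]

```r
## Allocate a large object (~80 MB of doubles -- size is illustrative),
## then remove it and force garbage collection.
x <- matrix(rnorm(1e7), ncol = 100)
print(object.size(x), units = "Mb")

rm(x)
invisible(gc())   # R has freed the memory internally...
## ...but Activity Monitor / top may still report the old resident
## set size, because the OS allocator keeps the freed pages.
```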

Didier
On Fri, 27 May 2016, Dr Didier G. Leibovici wrote:

Hi Didier,

Trying a bare-bones script may be sensible. There shouldn't be anything in
those packages that creates these effects as such. How many cores are
running R simultaneously? There shouldn't be anything OSX-specific either, 
though memory management varies across platforms (there was a recent 
discussion on R-devel about this).

So if you could share such a bare-bones script and simulated data setup, 
others might be able to contribute.
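[A sketch of the kind of bare-bones test script suggested here; the data path, layer name, and iteration count are placeholders, and gBuffer()/readOGR() are just representative rgdal/rgeos calls. The idea is to repeat the load/compute/clean-up cycle and watch whether reported memory grows.]

```r
library(rgdal)
library(rgeos)

for (i in 1:5) {
  ## hypothetical data source and layer
  shp <- readOGR(dsn = "path/to/data", layer = "features")
  buf <- gBuffer(shp, width = 0)   # some representative rgeos work
  rm(shp, buf)
  invisible(gc())
  cat("iteration", i, ":", gc()["Vcells", "(Mb)"], "Mb Vcells used\n")
}
```

If gc()'s "used" column stays flat while the OS monitor keeps climbing, the growth is in the allocator/OS layer rather than in R objects.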

Best wishes,

Roger

Hello Didier,
only restarting R can really free up memory. This is a typical issue on 
some operating systems.
"Why is R apparently not releasing memory? [...] This is an artifact of 
the way the operating system (OS) allocates memory. In general it is 
common that the OS is not capable of releasing all unused memory. In 
extreme cases it is possible that even if R frees almost all its memory, 
the OS can not release any of it due to its design and thus tools such 
as ps or top will report substantial amount of resident RAM used by the 
R process even though R has released all that memory. [...] The short 
answer is that this is a limitation of the memory allocator in the 
operating system and there is nothing R can do about it."
See details here:
http://www.hep.by/gnu/r-patched/r-faq/R-FAQ_93.html
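[A small sketch of how to observe the divergence the FAQ describes, comparing R's own accounting with the OS view of the same process; the `ps` call assumes macOS/Linux.]

```r
## R side: the "used" column is what R itself currently holds.
gc()

## OS side: resident set size (RSS) of this R process, via ps.
rss_kb <- as.numeric(system(
  paste("ps -o rss= -p", Sys.getpid()), intern = TRUE))
cat("OS-reported RSS:", round(rss_kb / 1024), "Mb\n")

## The two can diverge widely: R may have released memory that the
## OS allocator has not returned to the system.
```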
Best wishes,
Ákos

On 2016-05-27 at 20:20, Roger Bivand wrote: