On Fri, 27 May 2016, Dr Didier G. Leibovici wrote:
Hi,
I guess this may not be specific to r-sig-geo, but as I am using
library(rgdal)
library(rgeos)
in this script, they may be the reason? (Perhaps I should try running
something else to check.)
Hi Didier,
Trying a bare-bones script may be sensible. There shouldn't be
anything in those packages that creates these effects as such. How
many cores are running R simultaneously? There shouldn't be anything
OSX-specific either, though memory management varies across platforms
(there was a recent discussion on R-devel about this).
So if you could share such a bare-bones script and simulated data
setup, others might be able to contribute.
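For instance, something like the following (a minimal sketch only: the matrix of random numbers stands in for your 61288 features, and run_once() / colSums() are placeholders, not anything from your actual script):

```r
## Bare-bones memory test: no rgdal/rgeos, just base R.
## The 61288 rows stand in for the features; the work done is arbitrary.
run_once <- function(n = 61288, k = 100) {
  x <- matrix(rnorm(n * k), nrow = n)  # simulated attribute data
  colSums(x)                           # some throwaway computation
}

for (i in 1:3) {
  res <- run_once()
  rm(res)
  g <- gc()                            # force a collection and capture the table
  cat(sprintf("run %d: Vcells used = %.1f Mb, max used = %.1f Mb\n",
              i, g["Vcells", 2], g["Vcells", 6]))
}
```

If the "used" column is flat across runs but the OS monitor still climbs, that points away from the packages and towards how the allocator holds on to pages.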
Best wishes,
Roger
So basically I am running some code reading 61288 features and doing
other things ... if I run it once, gc() gives:
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 1833926 98.0 5103933 272.6 9968622 532.4
Vcells 2437534 18.6 7056348 53.9 11036325 84.3
and the monitor says R is using 3.3 GB.
Then I remove everything with rm(list=ls()) and run it again, trying
different sets of parameters, for example.
The second run gives similar gc() output, but R is using 6.4 GB:
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 1834325 98.0 6323353 337.8 9968622 532.4
Vcells 2439267 18.7 7572947 57.8 11832730 90.3
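One aside worth checking (a sketch with made-up names, not taken from the script above): rm(list=ls()) only removes what ls() lists, and ls() skips names beginning with a dot, so hidden objects can survive a "remove everything" between runs.

```r
## rm(list = ls()) only removes what ls() lists, and ls() skips names
## starting with a dot, so such objects survive a "remove everything".
.hidden <- numeric(1e6)              # invisible to plain ls()
visible <- numeric(1e6)

rm(list = ls())                      # removes 'visible' only
exists(".hidden")                    # still TRUE: ~8 MB retained
rm(list = ls(all.names = TRUE))      # removes dot-named objects as well
invisible(gc())                      # then let the collector run
```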
After a while and a few more computations, the monitor says R is using
10 GB, and gc() gives:
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 1863272 99.6 5944937 317.5 9968622 532.4
Vcells 2503503 19.2 8462995 64.6 100858653 769.5
rm(list=ls()) followed by gc() then gives:
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 608451 32.5 4755949 254.0 9968622 532.4
Vcells 760517 5.9 6770396 51.7 100858653 769.5
but the monitor still shows 10 GB for R.
I have experienced the usage building up to 50 GB, at which point my
system tells me to close some apps, all while doing the same thing:
running one set, then rm(list=ls()) ... So at the moment do I just have
to close and re-start R?
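A sketch of the distinction between what gc() reports and what the monitor shows (sizes here are arbitrary): gc() tracks R's own accounting, and freed memory is reusable within R even when the process's resident size, as most allocators rarely return pages to the OS, keeps showing the old peak.

```r
## gc() reports R's internal accounting, not the OS-level resident size:
## "used" drops after rm() + gc(), and the freed space is reusable by R,
## even though the monitor can keep showing the old peak.
x <- numeric(1e7)                 # ~80 MB of doubles
with_x <- gc()["Vcells", 2]       # Mb used while x is alive
rm(x)
after  <- gc()["Vcells", 2]       # Mb used after collection
cat(sprintf("with x: %.0f Mb; after rm + gc: %.0f Mb\n", with_x, after))
gc(reset = TRUE)                  # reset the "max used" columns for the next run
```

In practice, restarting R is often the only way to hand the memory back to the OS, but gc(reset = TRUE) at least makes the "max used" columns meaningful per run rather than cumulative.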
Didier