R and memory

3 messages · Meriema Belaidouni, Brian Ripley, Thomas Lumley

#
Hello
I have some problems reading large data files with R.
Can someone tell me why running
R --visze=30M --nsize=2000k
in fact uses 63M?
Thank you,
meriema

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
#
On Wed, 10 Jan 2001, Meriema Belaidouni wrote:

We need to know more to be able to help with that.
Let me try.  That allocates (modulo a typo) 30Mb of heap and 2 million
cons cells.  I will assume you are using a 32-bit system and a version of
R prior to 1.2.0.  Then a cons cell is 20 bytes, and so the total space is
38.2Mb for those. So (and I tried it) the workspace is 68.2Mb. However,
depending on your system, not all of that may appear under, say, top, and in
any case the R process needs another 4Mb or so for code.
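Ripley's figures can be reproduced with a quick arithmetic sketch (plain Python, assuming 1Mb = 1024² bytes and his 20-byte cons cell for pre-1.2.0 32-bit R):

```python
MB = 1024 * 1024          # 1 Mb in bytes

vsize = 30 * MB           # --vsize=30M: 30 Mb of vector heap
nsize = 2_000_000         # --nsize=2000k: 2 million cons cells
cell = 20                 # bytes per cons cell (32-bit R before 1.2.0)

cons_mb = nsize * cell / MB
total_mb = vsize / MB + cons_mb
print(f"cons cells: {cons_mb:.1f} Mb")   # ~38.1 Mb
print(f"workspace:  {total_mb:.1f} Mb")  # ~68.1 Mb; Ripley's 68.2Mb up to rounding
```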

The current version of R may well behave differently, and we do suggest
that you do not use those flags any more.
#
On Wed, 10 Jan 2001, Meriema Belaidouni wrote:

It potentially uses even more than that. You have asked for 30Mb of vector
heap and 2 million cons cells. Each cons cell takes 32 bytes (I think), so
this would be 90Mb plus whatever the R program itself uses.
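Lumley's larger estimate follows from a 32-byte cons cell. The same arithmetic, as a Python sketch (1Mb = 1024² bytes assumed):

```python
MB = 1024 * 1024          # 1 Mb in bytes

nsize = 2_000_000         # --nsize=2000k: 2 million cons cells
cell = 32                 # bytes per cons cell in the then-current R (Lumley's figure)

cons_mb = nsize * cell / MB   # ~61 Mb
total_mb = 30 + cons_mb       # plus the 30 Mb vector heap from --vsize=30M
print(f"{total_mb:.0f} Mb")   # ~91 Mb, roughly the "90Mb" Lumley cites
```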

	-thomas

Thomas Lumley			Asst. Professor, Biostatistics
tlumley at u.washington.edu	University of Washington, Seattle
