
Transferring data from S+ to R

In response to my claim that both S and R are hopeless for large data
sets, Jim Lindsey has pointed out (I am sure he won't mind me repeating
it here):
Mea culpa, I was still thinking in S. You'll no doubt have seen the
recent posts on S-news about the shenanigans needed to get Splus to
function at all with large data sets.

In correspondence off this list, Prof. Brian Ripley wondered why
anybody would want to switch from Splus to R, given that R is not even
significantly faster in benchmarks, and obviously lacks many features. My
quick reply to that was: price. On reflection, though,
R's scoping rules, which avoid needless copying, are the real reason I
started *using* R in earnest as an alternative to Splus, i.e. even on a
machine that has Splus. I have run jobs that would not work in Splus
unless I had gigabytes of RAM, but which run in R using only the default
memory settings. Not that I have got it quite right yet, though; I still
haven't worked out how to make, for example,

var1 <- rnorm(10000)

not create a new instance, eating 10000 globs of memory, every time it
is called within a loop; clearly the temporary that is passed to, and
overwrites, var1 could be freed as soon as the assignment is made.
Or does R treat var1 as a new variable every time the parser hits
this line? Would it not then be possible to check for, and free up, the
'old' var1 space straight away?
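For what it's worth, here is a small sketch of the two patterns at issue. The comments describe what I believe R's allocator and garbage collector do here (an assumption on my part, not something the original discussion confirms): each rnorm() call does allocate a fresh vector, but once the assignment rebinds var1 the previous vector has no remaining references, so the collector can reclaim it and memory use stays bounded across iterations.

```r
## Sketch: reassigning var1 inside a loop. Each rnorm() call allocates
## a fresh vector; after the assignment the old one is unreferenced,
## so the garbage collector is free to reclaim it.
for (i in 1:100) {
  var1 <- rnorm(10000)
}

## Preallocating and filling in place keeps var1 bound to one block of
## storage. The rnorm() temporary is still allocated on each pass, but
## it only lives until the copy into var1 completes.
var1 <- numeric(10000)
for (i in 1:100) {
  var1[] <- rnorm(10000)   # overwrite the existing vector's contents
}
```

Either way, var1 ends up as a numeric vector of length 10000; the difference is only in how much transient allocation the loop churns through.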

Simon Fear

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._