
heap size trouble

2 messages · Jim Lemon, Douglas Bates

karamian wrote:
...I want to load a file that contains 93 thousand rows and 22 columns of
data (essentially float)...

I just had to process over 199000 records with four numeric values.  If
I remember correctly, I used:

--vsize 30M  --nsize 500000

which pretty much ate all the RAM (64M) I had.  Don't forget to "rm" big
data sets before you exit, or R will bomb when you next try to load
without the increased memory.  Just reread from the data file when you
need them again (and it helps to exit other apps before starting R to
avoid disk thrashing).
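A minimal sketch of that workflow in R (the file name, object name, and data are illustrative, not from the original post): read the large table, drop it with rm() before quitting, and re-read it from the file next time instead of carrying it in the saved workspace.

```r
## Illustrative stand-in for the real data file (~93,000 rows, 22 columns
## in the original question); here a tiny temp file keeps the sketch runnable.
tf <- tempfile(fileext = ".txt")
write.table(data.frame(x = rnorm(100), y = rnorm(100)),
            tf, row.names = FALSE)

big <- read.table(tf, header = TRUE)  # the large data set
## ... analysis ...
rm(big)  # remove the big object so the saved workspace stays small
gc()     # return the freed memory to R's heap
## When the data are needed again, re-read them from the file
## rather than restoring them from a bloated .RData image.
```

Without the rm(), a saved workspace containing the big object must fit back into the heap at the next startup, which is exactly the failure Jim describes.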

Jim

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
Jim Lemon <bitwrit at ozemail.com.au> writes:
Another approach is to use a relational database to store such a large
table and load the table into R from the database.  There are several
packages that interface R with relational databases.
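A sketch of that approach using the DBI and RSQLite packages (my choice for illustration; the original post names no specific interface, and the table and column names here are made up). The point is to let the database hold the full table and pull only the needed rows into R.

```r
library(DBI)

## An in-memory SQLite database keeps this sketch self-contained;
## in real use you would connect to a database file or server.
con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "readings",
             data.frame(x = 1:5, y = c(-1, 2, -3, 4, 5)))

## Let the database do the filtering, so R only ever holds the subset:
dat <- dbGetQuery(con, "SELECT x, y FROM readings WHERE y > 0")
dbDisconnect(con)
```

This sidesteps the heap-size flags entirely, since R never needs to hold the full 93,000-row table at once.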