
Message-ID: <6rr9aikfj6.fsf@franz.stat.wisc.edu>
Date: 2000-05-31T13:36:29Z
From: Douglas Bates
Subject: heap size trouble
In-Reply-To: Jim Lemon's message of "Wed, 31 May 2000 21:51:13 +1000"

Jim Lemon <bitwrit at ozemail.com.au> writes:

> karamian wrote:
> 
> ...I want to load a file that contains 93 thousand rows and 22 columns of
> data (essentially float)...
> 
> I just had to process over 199000 records with four numeric values.  If
> I remember correctly, I used:
> 
> --vsize 30M  --nsize 500000
> 
> which pretty much ate all the RAM (64M) I had.  Don't forget to "rm" big
> data sets before you exit, or R will bomb when you next try to load
> without the increased memory.  Just reread from the data file when you
> need them again (and it helps to exit other apps before starting R to
> avoid disk thrashing).
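For reference, the limits Jim quotes are passed on the command line when
starting R (these flags apply to R 1.x; exact option handling may differ
between versions).  A sketch, assuming a Unix shell:

```shell
# Start R with an enlarged heap: 30 MB of vector heap (--vsize)
# and 500000 cons cells (--nsize), as suggested above.
R --vsize 30M --nsize 500000

# Inside R, before quitting, drop the big object so the saved
# workspace stays small enough to restore under default limits:
#   rm(bigdata); q("yes")
```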

Another approach is to store such a large table in a relational database
and load it into R from there.  There are several interfaces between R
and relational databases.
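As an illustration of the database approach, here is a sketch using the
RODBC package; the package name, the data source name "mydb", and the
table name "bigtable" are assumptions for the example, not part of the
original message.

```r
## Sketch: read the large table from a relational database instead of
## parsing the whole text file into memory at once.
## Assumes RODBC is installed and an ODBC data source "mydb" exists.
library(RODBC)

channel <- odbcConnect("mydb")

## Let the database do the subsetting, so only the rows and columns
## actually needed ever reach R:
dat <- sqlQuery(channel, "SELECT * FROM bigtable WHERE year = 1999")

odbcClose(channel)
```

The advantage over raising --vsize/--nsize is that the full data set
never has to fit in R's heap; each query pulls only what is needed.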
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._