From the R-FAQ:
R (currently) uses a _static_ memory model. This means that when it
starts up, it asks the operating system to reserve a fixed amount of memory
for it. The size of this chunk cannot be changed subsequently. Hence, it
can happen that not enough memory was allocated, e.g., when trying to read
large data sets into R.
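(The FAQ also explains that the size of this fixed chunk can be chosen
when R starts up. If I remember the options correctly, something along
the lines of

    R --vsize=10M --nsize=500k

asks for a 10MB vector heap and 500,000 cons cells; check the FAQ for
your version for the exact flags.)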
Out of curiosity, what is the upper limit on the size of data that R can
process, in terms of number of rows/columns or in MBytes? And if this
limit exists, is it hardware related? (e.g. can a computer with 256MB
process more data than one with 64MB?)
This is about to change in 1.2. Luke Tierney has rewritten the memory
management in R so that this restriction no longer applies. On the other
hand, the computational model used within R is really only suitable for
data sets of at most a few tens of megabytes. The problem is that data
sets are memory-resident and some computations will copy the entire
data set.
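As a rough back-of-the-envelope sketch (assuming doubles at 8 bytes per
cell, and the standard object.size() function):

    ## A numeric data set costs about 8 bytes per cell before any copies.
    rows <- 1e6; cols <- 10
    rows * cols * 8 / 2^20   # ~76 MB for the raw data alone
    x <- matrix(0, nrow = 1e5, ncol = 10)
    object.size(x)           # ~8 MB for this smaller example
    y <- x * 2               # operations like this materialise a full
                             # copy, roughly doubling peak memory use

So on a 64MB machine even a 20-30MB data set can run out of memory once
a computation makes a copy or two, which is why a 256MB machine can
indeed handle considerably more data.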