R memory usage and size limits

2 messages · Tom Quarendon, Brian Ripley

Tom Quarendon:
I have a general question about R's usage of memory and what limits
exist on the size of datasets it can deal with.
My understanding is that all objects in a session are held in memory.
This implies that you're limited in the size of datasets you can
process by the amount of memory you have access to (be it physical or
paging). Is this true? Or does R store objects on disk and page them in
as parts are needed, in the way that SAS does?
Are there 64-bit versions of R that can therefore deal with much larger
objects?

Many thanks.
Brian Ripley:
Please read ?"Memory-limits" and the R-admin manual for basic 
information.
On Thu, 5 Feb 2009, Tom Quarendon wrote:

That's rather a false dichotomy: paging uses the disk, so the real
distinction is whether R implements its own virtual memory system or
relies on the OS's (it does the latter).
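As a minimal illustration of the point (a sketch, not part of the original thread): an R object resides entirely in the R process's heap, which you can see by measuring it with object.size() and inspecting the heap with gc(); any paging of that memory is done by the operating system, not by R.

```r
# A million doubles occupy about 8 MB of process memory, all resident
# in R's own heap.
x <- rnorm(1e6)
print(object.size(x), units = "MB")   # roughly 7.6 Mb
gc()   # reports current heap usage; the OS, not R, handles any paging
```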

There are also interfaces to DBMSs for use with large datasets: see 
the R-data manual and also look at the package list in the FAQ.
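To make the DBMS route concrete (a sketch assuming the DBI and RSQLite packages, which are not named in the thread, and a hypothetical database file and table): you can stream a large table in fixed-size chunks so that only one slice is resident in memory at a time.

```r
library(DBI)
# Connect to a (hypothetical) SQLite file holding the large dataset.
con <- dbConnect(RSQLite::SQLite(), "big_data.sqlite")
res <- dbSendQuery(con, "SELECT * FROM measurements")
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 10000)  # at most 10,000 rows resident at once
  # ... process chunk ...
}
dbClearResult(res)
dbDisconnect(con)
```

The same chunked pattern works with any DBI-compliant backend (PostgreSQL, MySQL, and so on), so the dataset's size is bounded by the database, not by R's address space.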
Yes, there have been 64-bit versions of R for many years, and they are 
in routine use on very large problems.