
R on Large Data Sets (again)

Hello Lars,
On 2009.11.28 18:53:09, Lars Bishop wrote:
I think you'll have to provide a more precise definition of
"large"---are we talking 1 GB of records or 100 GB? Also, it would help
to know what you are trying to do with the data. The documentation for
the biglm and bigmemory packages may provide some help.
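For instance, biglm lets you fit a regression without loading the whole file into memory by updating the model one chunk at a time. A minimal sketch (assuming a hypothetical file "big.csv" with columns y, x1, x2, and a chunk size picked arbitrarily) might look like this:

```r
# Sketch only: fit a linear model on a file too large for RAM by
# feeding biglm one chunk at a time via update().
library(biglm)

con <- file("big.csv", open = "r")                 # assumed input file
first <- read.csv(con, nrows = 100000)             # first chunk (with header)
fit <- biglm(y ~ x1 + x2, data = first)            # initialize the model

repeat {
  chunk <- read.csv(con, header = FALSE, nrows = 100000,
                    col.names = names(first))
  if (nrow(chunk) == 0) break                      # end of file
  fit <- update(fit, chunk)                        # fold in the next chunk
}
close(con)
summary(fit)
```

The point is that only one chunk is ever resident in memory, so the data set can be far larger than RAM; bigmemory takes a different approach, backing matrices with shared or file-mapped storage.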
I'm not familiar enough with the commercial version of R, but I do
believe it provides better support for parallelization, which may be of
some help. I don't think, however, that this version will "solve" your
problem.
Possibly, but Win64 should provide plenty of memory (I believe Windows 7
Ultimate can address up to 192 GB). You just have to find a system that
can take that much... With Unix/Linux you can probably cut back on some
overhead, and the memory management is most likely better, but unless
you need to go over 192 GB of memory, you don't necessarily have to move
to a different platform.

~Jason