Memory limit for Windows 64bit build of R

5 messages · Alan.X.Simpson at nab.com.au, Jie, David Winsemius +2 more

#
On Aug 5, 2012, at 3:52 PM, Alan.X.Simpson at nab.com.au wrote:

It may depend in part on how that number is arrived at, and what you  
plan on doing with it. (Don't consider creating a dist object.)
The typical advice is that you will need memory three times as large  
as your largest dataset, and I find that even more headroom is needed. I  
have 32GB and my larger datasets occupy 5-6 GB and I generally have  
few problems. I had quite a few problems with 18 GB, so I think the  
ratio should be 4-5 x your 10GB object.  I predict you could get by  
with 64GB. (Please send a check for half the difference in cost between  
64GB and 128GB.)
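A short sketch of the arithmetic behind that rule of thumb (the figures below are illustrative, using the 10GB object and the 4-5x ratio mentioned above; `object.size()` and `gc()` are standard R utilities for checking the real footprint in a live session):

```r
# Rough arithmetic behind the "4-5x headroom" rule of thumb.
obj_gb <- 10                 # size of the large object in GB (from the thread)
ratio  <- 4.5                # midpoint of the suggested 4-5x ratio
needed <- obj_gb * ratio     # RAM suggested by the rule of thumb, in GB
needed                       # 45

# In a live session, check an object's actual footprint with, e.g.:
#   print(object.size(x), units = "Gb")
# and inspect total memory use with gc().
```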
#
On 06.08.2012 09:34, David Winsemius wrote:
10Gb objects should be fine, but note that a single vector/array/matrix cannot 
exceed 2^31 - 1 elements, hence at most roughly 17Gb for a 
vector/matrix/array of doubles.

Best,
Uwe Ligges
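The "17Gb" figure follows directly from the element limit: a double occupies 8 bytes. (Later R versions added long-vector support that relaxes this limit for plain vectors, but at the time of this thread the cap applied.)

```r
# The per-vector element limit and the resulting size cap for doubles.
max_elems <- 2^31 - 1        # maximum length of a single R vector/matrix here
bytes     <- max_elems * 8   # each double occupies 8 bytes
bytes / 1e9                  # ~17.18 GB, matching the "17Gb" figure above
```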
#
On 06/08/2012 09:42, Uwe Ligges wrote:
The advice is 'at least 3 times'.  It all depends on what you are doing 
(and how slow your swap is -- on Windows it is likely to be slow; on a 
Linux box with a fast SSD it can be viable to use swap).
But 3 x 18GB > 32GB!
That's true for R 2.15.1, but not for the development version.  Further, 
R-devel makes substantially fewer copies of objects, and most of those 
improvements have been ported to R-patched.

dist() is one example of substantial improvements.
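The earlier warning against creating a dist object comes down to the same arithmetic: `dist()` stores all n*(n-1)/2 pairwise distances as doubles, so the result grows quadratically in the number of rows. A sketch (the row counts below are chosen for illustration, not taken from the thread):

```r
# Size in GB of the lower-triangular distance vector dist() produces
# for n rows: n*(n-1)/2 doubles at 8 bytes each.
dist_gb <- function(n) n * (n - 1) / 2 * 8 / 1e9

dist_gb(50000)                  # ~10 GB of distances for 50,000 rows

# The 2^31 - 1 element cap discussed above is hit once n*(n-1)/2
# exceeds it -- at roughly this many rows:
ceiling(sqrt(2 * (2^31 - 1)))   # ~65536
```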