
memory issue on R with Linux 64

Well, this doesn't come as a surprise; if it did for you then you didn't 
read the list archives well.

R has been designed for analysing statistical data sets, which usually 
don't number in the billions of observations, and not for 
analysis/processing of large grids/imagery.

rgdal has infrastructure that lets you work through huge grids by reading 
and writing only parts at a time; you can find pointers to this in the 
rgdal documentation, and examples on the list. I don't know of functions 
that do this automatically for you; maybe the raster package on r-forge?
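A minimal sketch of that block-wise approach, using rgdal's readGDAL with 
its offset and region.dim arguments (the file name, block size, and the 
processing step are hypothetical placeholders):

```r
library(rgdal)

fname <- "big_grid.tif"            # hypothetical large input file
info  <- GDALinfo(fname)           # grid dimensions without loading the data
nrows <- info[["rows"]]
ncols <- info[["columns"]]

blk <- 512                         # rows per block; tune to available RAM
for (start in seq(1, nrows, by = blk)) {
  n <- min(blk, nrows - start + 1)
  # read only rows start..start+n-1 (offset is zero-based: c(row, col))
  chunk <- readGDAL(fname, offset = c(start - 1, 0),
                    region.dim = c(n, ncols), silent = TRUE)
  # ... process chunk@data here, e.g. summaries or reclassification ...
}
```

Each pass only holds one block in memory, so the full grid never has to 
fit in RAM at once.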

Another option is to buy more RAM. I am using Debian on a 32 GB RAM 
workstation; I was surprised (really) how little it cost. It saves me 
time.
--
Edzer
Alexander.Herr at csiro.au wrote: