memory issue on R with Linux 64
Well, this doesn't come as a surprise; if it did for you, then you didn't read the list archives closely. R was designed for analysing statistical data, which usually doesn't number in the billions of observations, and not for the analysis/processing of large grids/imagery. rgdal has infrastructure that lets you work through huge grids by reading and writing only parts at a time; you can find pointers to this in the rgdal documentation, and examples on the list. I don't know of functions that do this automatically for you; maybe the raster package on R-Forge? Another option is to buy more RAM. I am using Debian on a 32 GB RAM workstation; I was surprised (really) how little it cost. It saves me time. -- Edzer
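For what it's worth, a minimal sketch of the part-at-a-time approach, using readGDAL's offset and region.dim arguments. The file name and chunk size here are placeholders; tune the chunk to what fits in your RAM:

```r
library(rgdal)

fname <- "input.tif"            # hypothetical input grid
info  <- GDALinfo(fname)        # grid dimensions, without reading the data
nrows <- info[["rows"]]
ncols <- info[["columns"]]
chunk <- 1000                   # rows per block; adjust to available memory

for (start in seq(1, nrows, by = chunk)) {
  nr <- min(chunk, nrows - start + 1)
  # read only nr rows, all columns; offset is c(rows, cols) from the
  # top-left corner of the grid
  block <- readGDAL(fname, offset = c(start - 1, 0),
                    region.dim = c(nr, ncols))
  # ... process block@data here, writing results out as you go ...
}
```

Each pass holds only one block in memory, so the peak allocation is roughly chunk/nrows of the full grid rather than the 3.1 GB readGDAL tried to allocate in one go.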
Alexander.Herr at csiro.au wrote:
Hi List,
I get an error using readGDAL{rgdal}: cannot allocate vector of size 3.1 Gb
I am using 64-bit Linux (openSUSE 11) with 4 GB swap and 4 GB RAM, and R 2.8.0.
The load monitor shows that most of the RAM is used up, and then, when swap use starts increasing, R returns the error.
Is there anything I should do within R to circumvent this?
Any help appreciated
Thanks
Herry
_______________________________________________ R-sig-Geo mailing list R-sig-Geo at stat.math.ethz.ch https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Edzer Pebesma Institute for Geoinformatics (ifgi), University of Münster Weseler Straße 253, 48151 Münster, Germany. Phone: +49 251 8333081, Fax: +49 251 8339763 http://ifgi.uni-muenster.de/ http://www.springer.com/978-0-387-78170-9 e.pebesma at wwu.de