memory - cannot allocate vector of size 'x'
On Tue, 4 Mar 2008, Jose Funes wrote:
Roger, I am running it in Windows and the size is 8 bytes. I tried to use gc() for the regression kriging but it did not work; instead I employed gctorture(), which makes R run extremely slowly (it is still running, almost 8 hours so far; it may finish tomorrow). For the regression kriging, I am using 163 data points.

R code (regression kriging):

ystrk = krige(yststep$call$formula, ystdec3, predictors, vgf1yst)

where:
yststep: the regression residuals
ystdec3: the data points (163 points)
predictors: the asciigrid files containing the predictor variables (9 raster layers)
vgf1yst: the fitted variogram
But no use of maxdist= or nmax=? I would have said:

g1 <- gstat(formula=yststep$call$formula, data=ystdec3, model=vgf1yst)
ystrk <- predict(g1, newdata=predictors)

anyway, which shows exactly how to resolve the problem - subset the newdata= prediction grid into tiles and patch the predictions together again afterwards (the patching may need thinking through carefully). I assume that you have now successfully read in the predictors?

Do I understand that you are calibrating your model on 163 observations, and predicting for roughly four orders of magnitude more grid cells?

gc() can help to clean out garbage sometimes - say, in a loop; gctorture() really belongs only in development, not in regular use.

Roger
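A minimal base-R sketch of the tile-and-patch idea suggested above. Note that lm(), the simulated observations, and the grid below are stand-ins for the gstat call and the real objects (ystdec3, predictors); this only illustrates the split/predict/recombine pattern, not the actual kriging code:

```r
# Tile-and-patch: predict over a large grid in chunks rather than all at once,
# so no single huge prediction object has to be allocated at one time.
# lm() stands in here for the gstat/krige prediction step; the pattern is the same.
set.seed(1)
obs   <- data.frame(x = runif(163), y = runif(163))
obs$z <- 2 * obs$x + 3 * obs$y + rnorm(163, sd = 0.1)
fit   <- lm(z ~ x + y, data = obs)

# A large prediction grid (stand-in for the 9-layer predictor stack)
grid <- expand.grid(x = seq(0, 1, length.out = 200),
                    y = seq(0, 1, length.out = 200))

# Split the row indices into tiles, predict each tile, then patch together again
tiles <- split(seq_len(nrow(grid)), ceiling(seq_len(nrow(grid)) / 5000))
pred  <- unlist(lapply(tiles, function(i) {
  p <- predict(fit, newdata = grid[i, , drop = FALSE])
  gc()  # give the garbage collector a chance between tiles
  p
}), use.names = FALSE)
```

Because the tiles partition the rows in order, the patched result is identical to a one-shot prediction over the whole grid; with a local kriging neighbourhood (nmax=/maxdist=) the same partitioning applies, but edge effects between tiles need the "thinking through" mentioned above.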
error message:

In addition: There were 12 warnings (use warnings() to see them)
warning messages:
1: Reached total allocation of 1535Mb: see help(memory.size)
...
3: In slot(value, what) <- slot(from, what) : ...
8: In slot(value, what) <- slot(from, what) :
  Reached total allocation of 1535Mb: see help(memory.size)
...
12: In `slot<-`(`*tmp*`, what, value = structure(c(-293438.89990765, ... :
  Reached total allocation of 1535Mb: see help(memory.size)

Is there another way to get around garbage collection than gctorture()?

Sincerely,
Jose

On Tue, Mar 4, 2008 at 3:48 AM, Roger Bivand <Roger.Bivand at nhh.no> wrote:
On Mon, 3 Mar 2008, Jose Funes wrote:
Dear members, I would like to share some of the problems that I have run into in R when importing ascii files. I describe the problem below; if any of you have experienced similar issues, I will appreciate your support.

I am working with raster data (maps) and importing it as ascii files into R. I am stacking 9 layers (~10MB each, 90MB total). However, when stacking the eighth layer I get the following message: "Error: cannot allocate vector of size 10.8 Mb". I did a little bit of reading about this, and some suggestions are to increase the memory allocation. I increased it to 3GB using the following command:

memory.limit(size=3000)

but the problem still persists.
Which platform? Windows? Its performance is systematically worse than other OSes on the same hardware. When you say 10Mb, is this 8 bytes * roughly 1.2 million cells?

Are you running gc() between calls to readAsciiGrid() to force garbage collection? While readAsciiGrid() is working, I think that at least three copies of the data are in memory.

Have you tried using rgdal? Maybe readGDAL() keeps fewer internal copies? (in readGDAL that may still be at least two copies, but I'm not sure - NAs are handled by GDAL). Because the input needs to be converted from representation to representation, it is not possible to avoid multiple copies entirely.
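The gc()-between-reads suggestion can be sketched as follows. Here read.table() on small temporary files stands in for readAsciiGrid() on the real .asc layers, so this only illustrates the looping pattern, not the actual sp/maptools calls:

```r
# Sketch: read several layers in a loop, forcing garbage collection between
# reads so the temporary copies made during each read are released promptly.
# tempfile()-based stand-ins replace the real .asc files, and read.table()
# stands in for readAsciiGrid().
files <- replicate(3, tempfile(fileext = ".asc"))
for (f in files) write.table(matrix(runif(100), 10), f,
                             row.names = FALSE, col.names = FALSE)

layers <- vector("list", length(files))   # pre-allocated list, one slot per layer
for (i in seq_along(files)) {
  layers[[i]] <- read.table(files[i])
  gc()  # release intermediate copies before the next read
}
```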
code:
1. predictorslide2 <- readAsciiGrid("sol_spr_lide2.asc")
...
...
8. predictorslide2 <- cbind(predictorslide2,readAsciiGrid("umca_pr_lide2.asc"))
It may be possible to get round cbind() copying if that turns out to be the problem, by taking the SpatialGrid from the first read, putting it aside, and storing the single columns of the data slots of read data in a single, pre-allocated data frame. Re-allocating to predictorslide2 successively is not a good idea for large objects - Braun & Murdoch in their nice book on R programming treat it as a worst case on speed and memory usage.
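A base-R sketch of the pre-allocation idea described above. The column values here are simulated, and ncell is scaled down; in the real case each column would come from the data slot of one readAsciiGrid() result and ncell would be the true cell count:

```r
# Sketch: pre-allocate one data frame for all 9 layers instead of growing
# predictorslide2 with repeated cbind() calls (which re-copies everything
# read so far on every call).
ncell   <- 1.2e6 %/% 1000          # small stand-in for ~1.2 million cells
nlayers <- 9
pred <- as.data.frame(matrix(NA_real_, nrow = ncell, ncol = nlayers))
names(pred) <- paste0("layer", seq_len(nlayers))

for (j in seq_len(nlayers)) {
  # Replacing one column copies only the data frame's list of column
  # pointers, not the data held in the other columns.
  pred[[j]] <- runif(ncell)
  gc()
}
```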
Also I got a similar error when running regression kriging: Error: cannot allocate vector of size 6.7 Mb
Without the command, this isn't informative. Were you using local areas (maximum distance or # data points)? How large were the data= and newdata= objects? Roger
In addition: There were 12 warnings (use warnings() to see them)
warning messages:
1: Reached total allocation of 1535Mb: see help(memory.size)
...
3: In slot(value, what) <- slot(from, what) : ...
8: In slot(value, what) <- slot(from, what) :
  Reached total allocation of 1535Mb: see help(memory.size)
...
12: In `slot<-`(`*tmp*`, what, value = structure(c(-293438.89990765, ... :
  Reached total allocation of 1535Mb: see help(memory.size)

I will appreciate your suggestions. I have also considered splitting the data but would like to explore other solutions.

Regards,
Jose Funes
_______________________________________________ R-sig-Geo mailing list R-sig-Geo at stat.math.ethz.ch https://stat.ethz.ch/mailman/listinfo/r-sig-geo
-- Roger Bivand Economic Geography Section, Department of Economics, Norwegian School of Economics and Business Administration, Helleveien 30, N-5045 Bergen, Norway. voice: +47 55 95 93 55; fax +47 55 95 95 43 e-mail: Roger.Bivand at nhh.no