memory allocation problem with envelope function in spatstat
Hi Erika, Yes, this is R's classic warning that you don't have enough memory. I've done K-function analyses with much larger (19,000-point) data sets on a machine with 3.5 GB of RAM (running 32-bit Linux), so what you are doing should be possible. One thought is to create a new R workspace containing only your point data set. R wants to hold the whole workspace in memory, so if you have a lot of other objects lying around your workspace, you can run into problems. Dan
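Dan's clean-workspace suggestion can be sketched as follows (a minimal illustration, not part of either post; the object name SPECIES.ppp is taken from Erika's code and the file name is made up):

```r
# In the original, cluttered session: save only the point pattern to disk
save(SPECIES.ppp, file = "species_ppp.RData")

# Then quit, start a fresh R session with an empty workspace, and reload
# just that one object before rerunning the analysis:
library(spatstat)
load("species_ppp.RData")
```

This way the envelope() call competes only with the point pattern itself for memory, not with everything else accumulated in the old workspace.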
On Fri, 2010-05-07 at 13:06 -0700, Erika Mudrak wrote:
Hi everyone- I am trying to fit an inhomogeneous K function to a large(ish) dataset. I have successfully executed this code on data sets with a smaller n.
SPECIES.ppp
planar point pattern: 1598 points
window: rectangle = [564320, 564320] x [228490, 228500] units
lambda = density(SPECIES.ppp)
Lin = envelope(SPECIES.ppp, Linhom, nsim=5,
    simulate=expression(rpoispp(lambda)),
    correction="border")
Error: cannot allocate vector of size 244.5 Mb

I am working on a Windows Vista OS with 3 GB RAM, and I have attempted to increase the memory limit using memory.limit(2000) within R and by adding --max-vsize=2000M to the end of the Target field in the Shortcut tab of the R properties dialog box. I have also tried making a list of simulated patterns ahead of time, but I get the same error:
simpatterns = list()
for (i in 1:5) simpatterns[[i]] = rpoispp(lambda)
Lin = envelope(SPECIES.ppp, Linhom, nsim=5, simulate=simpatterns,
    correction="border")
Error: cannot allocate vector of size 244.5 Mb

The spatstat FAQ suggests that a complete analysis should be feasible on a data set of up to 4000 points. Can anyone suggest ways to increase my memory limit or to modify my code to make this possible? I don't know what else to try! Ideally I would like to be able to set nsim=200, but I am trying to make it work with small numbers for now. Thank you, Erika Mudrak
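For readers following along, the memory settings Erika describes can also be inspected and adjusted interactively; a minimal sketch (Windows-only, values in MB, and specific to R versions of that era, since memory.limit() was later removed from R):

```r
# Report the current memory limit, in MB
memory.limit()

# Request a larger limit; the size chosen here is illustrative
# (it cannot exceed what the OS is willing to grant)
memory.limit(size = 2500)
```

Note that the argument to memory.limit() is the total allocation ceiling for the R session, so passing a value smaller than the current limit has no useful effect.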
Dan Putler Sauder School of Business University of British Columbia