spsample taking excessive memory

1 message · Roger Bivand

On Sat, 30 Apr 2011, Dylan Beaudette wrote:

Hi,

The polygons are very small and spread out over a very large area, so 
spsample tries to generate many points across the bounding box of the 
whole object, rejecting most of them. As Dylan suggests, you can get a 
long way by stepping through only the polygons you need:

for (x in seq_along(uniqueAtt)) {
   # subset the polygons belonging to this class
   class_data <- vec[vec[[attName]] == uniqueAtt[x], ]
   areas <- sapply(slot(class_data, "polygons"), slot, "area")
   # divide numsamps across polygons in proportion to their areas
   nsamps <- ceiling(numsamps * (areas / sum(areas)))
   # step Polygons object by Polygons object, so that rejection sampling
   # only runs within each small polygon's own bounding box
   for (i in 1:nrow(class_data)) {
     XY <- spsample(class_data[i, ], type = "random", n = nsamps[i])
     if (i == 1) cpts <- XY
     else cpts <- rbind(cpts, XY)
   }
# maybe need to modify the number of points to match numsamps exactly,
# since ceiling() can over-allocate by up to one point per polygon
   classpts <- cpts
   if (x == 1) {
     xy <- classpts
   } else {
     xy <- rbind(xy, classpts)
   }
}
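The ceiling() in the allocation above can hand out up to one extra point per polygon, which is why the totals may need adjusting afterwards. A minimal base-R sketch of both effects, using made-up areas (the `keep` index vector is hypothetical; the same indexing would be applied to the SpatialPoints object as `cpts[keep, ]`):

```r
# area-proportional allocation: ceiling() can over-allocate
areas <- c(1, 2, 4)       # toy polygon areas
numsamps <- 10
nsamps <- ceiling(numsamps * areas / sum(areas))
nsamps                    # 2 3 6
sum(nsamps)               # 11, one more than requested

# trim a random subset of indices back to exactly numsamps
keep <- sample(sum(nsamps), numsamps)
length(keep)              # 10
```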

The spsample() method for SpatialPolygons really does expect the member 
polygons to be close to one another, and to fill at least a large part 
of the object's bounding box.
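With rejection sampling, the expected number of candidates is the requested n divided by the acceptance probability, i.e. the fraction of the bounding box actually covered by polygons. A back-of-the-envelope calculation in base R, with invented numbers for the "tiny polygons, huge extent" case:

```r
n <- 100            # points requested
poly_area <- 1      # combined polygon area (toy value)
bbox_area <- 10000  # area of the enclosing bounding box (toy value)
# each candidate lands inside a polygon with probability
# poly_area / bbox_area, so on average spsample must generate
# roughly this many candidates to accept n of them:
expected_candidates <- n * bbox_area / poly_area
expected_candidates  # 1e+06
```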

Hope this helps,

Roger