Using adehabitat and splancs to achieve point-in-polygon analysis on raster?
On Fri, 25 Mar 2005, Clément Calenge wrote:
Hope you had a good time in the field! Is it spring in France?
Yes, though a little cloudy. And it is certainly colder than in South Africa!
Probably about the same in Bergen, quite warm for the time of year, snow still on the hills around town.
Thanks for the very helpful reply! I will try it right now! I need to present at a conference in early April and would like to get these results before that. Just three quick questions before you go to enjoy the Easter weekend:
- how do I get the values out of 'jj' (## jj contains the values of elevation inside the home range of the animals) to calculate mean, variance, etc., across all the points in 'jj'? I assume 'jj' is of class 'kasc'. In my case it might be 'asc', as I will have only a single home range at a time (one animal).
Just a small correction to my previous e-mail. If you create jj with:

jj <- lapply(foo, function(x) x*el)

jj is not a map of class "kasc". To convert jj to the class "kasc", copy the attributes of the map "foo" to jj with getkascattr:

jj <- getkascattr(foo, jj)
image(jj)

And now jj is a map of class "kasc", i.e. a data.frame with each column corresponding to a map. All the pixels located inside the home ranges are coded by numeric values in the data frame, and pixels located outside the home ranges are coded by NA. So you can get the pixel values with something like:

oo <- lapply(jj, function(x) x[!is.na(x)])

oo is a list with one component per animal. Each component corresponds to a vector of pixels inside the animal's home range. After that it is quite easy to get values such as the mean, median, standard deviation, etc.:

lapply(oo, summary)
lapply(oo, mean)
lapply(oo, sd)
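The masking pattern above does not depend on adehabitat at all; it is plain NA handling on a data.frame. A minimal self-contained sketch, with an invented toy data.frame standing in for a real "kasc" object (column names and values are made up):

```r
## toy "kasc"-like data.frame: one column per animal, NA = outside home range
jj <- data.frame(boar1 = c(120, NA, 150, NA, 180),
                 boar2 = c(NA, 200, 210, NA, 190))

## drop the NA pixels (those outside each home range)
oo <- lapply(jj, function(x) x[!is.na(x)])

## per-animal statistics on the remaining pixel values
lapply(oo, mean)
lapply(oo, sd)
lapply(oo, summary)
```

Each component of oo is just a numeric vector, so any base-R summary function applies directly.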
- can I get the points out of the 'kasc' object in a way that I can still run spatial statistics on them? I would like to calculate the nugget and range for the NDVI values within the home range.
I never performed such spatial statistics with home ranges, but this could certainly be done... IMHO, the best way to achieve it is to also get the coordinates of the kept pixels when computing the vector oo:

xyc <- getXYcoords(jj)
## xyc contains the coordinates of the pixels of the maps
## it is a list with two components: the vectors of x and y
## coordinates of the pixels of the maps

## compute the coordinates of each pixel
xc <- rep(xyc$x, times=length(xyc$y))
yc <- rep(xyc$y, each=length(xyc$x))
xyc <- data.frame(x=xc, y=yc)
lixyc <- lapply(1:ncol(jj), function(x) xyc[!is.na(jj[,x]),])
## So that each point in oo is associated with its coordinates
## in lixyc

## for example
plot(lixyc[[1]], col=grey(oo[[1]]/300), pch=15, cex=4, asp=1)
plot(lixyc[[2]], col=grey(oo[[2]]/300), pch=15, cex=4, asp=1)
plot(lixyc[[3]], col=grey(oo[[3]]/300), pch=15, cex=4, asp=1)

Then you have to compute a variogram. I have never used functions allowing such analyses, but it seems that the package nlme has a function Variogram...
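The coordinate-recovery step is again pure base R: rep() with times= expands x fastest and each= expands y slowest, so row k of the coordinate frame matches pixel k of the map vector. A sketch with an invented 3 x 3 grid (all coordinates and values are made up):

```r
## toy grid standing in for getXYcoords() output
xs <- c(0, 10, 20)    # x coordinates of pixel centres
ys <- c(0, 10, 20)    # y coordinates of pixel centres

## one map column, x varying fastest within each y row; NA = outside
vals <- c(NA, 5, NA,
           7, 9, NA,
          NA, NA, 2)

## all pixel coordinates, in the same order as vals
xyc <- data.frame(x = rep(xs, times = length(ys)),
                  y = rep(ys, each  = length(xs)))

## keep only the coordinates of pixels inside the home range
inside <- xyc[!is.na(vals), ]
```

The rows of `inside` line up one-to-one with the non-NA values of the map, which is exactly what a variogram routine needs (coordinates plus values).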
- do you know how to convert a polygon shapefile into the same class as produced by the command kernelUD? Alternatively I could import a raster version of the polygon shape file, convert it to the correct format, and repeat the spatial join method you are suggesting for the home ranges!
I'm sorry, I do not have any easy solution to this problem. The function kernelUD returns a *raster* map of class "asc", a central class in adehabitat. I don't know how to rasterize a shapefile in R (the libraries shapefiles and maptools, which deal with shapefiles, do not allow such operations). Maybe the simplest way would be to rasterize the shapefile with a GIS...
This is something we've been looking at for the sp package: how to make an object that represents the raster cells falling into one or more rings. It could be a grid with NA outside the vector mask, or a cell object with just the (regularly spaced) points inside the mask. We are not there yet; point-in-polygon is as close as we've got (that is, generate the full raster using expand.grid() and test each point in turn). This is needed, but isn't finished yet. Writing good functions to suit adehabitat package objects is going to be important, so the workflow you, Sander, have described is very useful. It would be nice to be able to say that this is ready now, but it isn't yet... At least one of the things sp is trying to do is provide shared object structures, so that writing functions from/to different polygon representations can be done from -> sp and sp -> to, and various packages should be able to "talk to" each other with fewer barriers. Best wishes, Roger
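The expand.grid()-plus-test approach Roger describes can be sketched in base R. The inpoly() function below is a hand-rolled even-odd (ray-casting) test written only for illustration; in practice one would use an existing routine such as splancs::inout(). The polygon coordinates and grid spacing are invented:

```r
## even-odd (ray casting) point-in-polygon test for a single point
## (px, py) against a polygon with vertex vectors vx, vy
inpoly <- function(px, py, vx, vy) {
  n <- length(vx)
  inside <- FALSE
  j <- n
  for (i in seq_len(n)) {
    if (((vy[i] > py) != (vy[j] > py)) &&
        (px < (vx[j] - vx[i]) * (py - vy[i]) / (vy[j] - vy[i]) + vx[i]))
      inside <- !inside
    j <- i
  }
  inside
}

## full raster of candidate cell centres
grd <- expand.grid(x = seq(0, 4, by = 1), y = seq(0, 4, by = 1))

## a square polygon (invented coordinates)
vx <- c(1, 3, 3, 1)
vy <- c(1, 1, 3, 3)

## "grid with NA outside the vector mask": 1 inside, NA outside
grd$mask <- ifelse(mapply(inpoly, grd$x, grd$y,
                          MoreArgs = list(vx = vx, vy = vy)),
                   1, NA)
```

Multiplying such a mask column against a raster of values then yields NA outside the rings, which is the same trick Clément uses below with the home-range maps.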
Hope this helps, Clem.
Thanks again, Sander. Clément Calenge wrote:
Dear Sander, Sorry for my late answer, I was in the field.
I send this email to you directly, because it concerns the link between adehabitat and splancs. I have two sources of polygons:
- polygon shapefiles with areas of interest;
- polygons calculated using the home range methods in adehabitat.
For the moment, each polygon source will only contain a single polygon. I have one source of points: a grid representing NDVI values, which I read using the import.asc function in adehabitat. I would like to use both polygon sources to select the points in the NDVI raster which fall inside the relevant polygons. Then I want to run several statistics (mean, standard deviation, etc.) on the NDVI values of these points.
The solution depends on what you call "polygon". The function mcp.rast() is intended to "rasterize" a polygon on a raster grid of class "asc" or "kasc". The polygon is here stored as a data frame with two columns (the X and Y coordinates of the polygon). The function hr.rast() can be used if you have an object of class "area", i.e. a data.frame with three columns (the X and Y coordinates of the polygon, and a third column defining the ID of the polygon).
Now, it seems that you work with a more complex type of polygon, e.g. kernel estimations of home ranges (where each animal has a home range, and each home range can be made of several polygons). In such cases, the "spatial join" operation can be more complex. I detail below an example of a "spatial join" with a kernel estimation of wild boar home ranges. Note that in this case, you do not have to compute the contour of the estimation. Just copy and paste the lines below into R:
## loads and display the data
library(adehabitat)
data(puechabon)
## keep only the elevation
el<-getkasc(puechabon$kasc,1)
image(el)
## estimate a kernel home-range
kud<-kernelUD(puechabon$locs[,c("X","Y")], puechabon$locs$Name, grid=el)
## kud contains a density
## note that the grid containing the values of the elevation
## is used for the estimation
vud<-getvolumeUD(kud)
image(vud)
## vud contains the limits of the home ranges estimated at
## different levels
toto<-lapply(vud, function(x) x$UD)
## toto is the list of maps of Utilization Distribution
tutu <- lapply(toto, function(x) {x[x <= 95] <- 1; x[x > 95] <- NA; return(x)})
## tutu is a list of raster maps containing 1 if the pixel is inside
## the 95% home range and NA otherwise
foo<-as.kasc(tutu)
image(foo)
## foo is a kasc object
jj <- lapply(foo, function(x) x*el)
image(jj)
## jj contains the values of elevation inside the home ranges
## of the animals
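The multiplication trick at the heart of this walkthrough is just elementwise arithmetic: multiplying a raster of values by a mask raster coded 1/NA propagates the NAs, leaving only the in-range values. A toy sketch with plain matrices (all values invented):

```r
## values raster (e.g. NDVI or elevation) and a 1/NA home-range mask
ndvi <- matrix(c(0.2, 0.4, 0.6, 0.8), nrow = 2)
mask <- matrix(c(1,   NA,  1,   NA),  nrow = 2)

## NA everywhere outside the home range, values inside
inside <- ndvi * mask

## statistics on the in-range pixels only
mean(inside, na.rm = TRUE)
```

Any NA-aware summary (mean, sd, quantile with na.rm = TRUE) then operates only on the pixels inside the mask.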
The last method available for home-range estimation is the Nearest Neighbor Convex Hull method of Getz and Wilmers (2004). The function NNCH.rast() can be used to "rasterize" the home-range estimation in the same way, and you can then multiply the resulting maps with your NDVI raster as above.
Hope this helps,
Clément.
Roger Bivand Economic Geography Section, Department of Economics, Norwegian School of Economics and Business Administration, Breiviksveien 40, N-5045 Bergen, Norway. voice: +47 55 95 93 55; fax +47 55 95 93 93 e-mail: Roger.Bivand at nhh.no