DEMs
Hello Abe, my thoughts on your questions are inline...
On Sunday 13 September 2009 15:05:04 Aben Woha wrote:
Hello All,

My questions are about the best way to obtain DEMs for the entire globe. I am trying to krige a dataset of global scope. The dataset has points at all integer latitude/longitude intersections, so there are 64,800 sample points. The problem I am having is that the DEMs I have been using are too large (in memory) to hold as R variables. I have also had trouble finding a way to download DEMs for the entire globe.

1. Given the size of my sample dataset: what is the maximum resolution of DEM that I should be using? Do I only need values at the prediction points, or will higher resolution yield higher accuracy?
As coarse as possible, since you only appear to need elevations at each 1x1-degree intersection (note that many of these locations are nearby or even identical at extreme high and low latitudes, due to the convergence of longitudes at the poles). The question is more complicated if what you really want is an average elevation per 1-degree 'chunk', or if you want to calculate aspect, slope, or some other derivative.

I would check out GTOPO30, a 30-arc-second global DEM: http://eros.usgs.gov/products/elevation/gtopo30/gtopo30.html Its accuracy depends on the source data, which is reported at the site above. There are other possibilities, but I don't know whether those sources are as recent, or whether you'd get much in the way of accuracy statistics, etc. See http://dss.ucar.edu/catalogs/ranges/range750.html for more options.
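For a sense of scale, here is a quick sketch (plain Python, with hypothetical names) of the 1x1-degree grid and of how the east-west spacing collapses toward the poles:

```python
import math

# Integer lat/lon intersections: 180 latitudes x 360 longitudes.
lats = range(-90, 90)    # 180 latitudes
lons = range(-180, 180)  # 360 longitudes
n_points = len(lats) * len(lons)
print(n_points)  # 64800, matching the sample size above

def ew_spacing_km(lat_deg, earth_radius_km=6371.0):
    """Approximate ground distance covered by 1 degree of longitude
    along the parallel at lat_deg (shrinks with cos(latitude))."""
    return math.radians(1) * earth_radius_km * math.cos(math.radians(lat_deg))

print(round(ew_spacing_km(0), 1))   # ~111.2 km between grid points at the equator
print(round(ew_spacing_km(89), 1))  # ~1.9 km near the pole
```

This is why a very fine DEM buys little here: adjacent sample points near the poles are only a couple of kilometers apart even on the coarse 1-degree grid.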
2. Is there a way to programmatically access DEMs using GDAL or R in general? I.e., internally or through integration with McIDAS, NetCDF, etc.
Yes, check out the rgdal package. You can also use GDAL (and its Python bindings) to preprocess the dataset outside of R - perhaps to thin it to just the lat/lons you need - before importing. I have used PyGDAL to access (and sample from) tens of gigabytes of tiled elevation data (i.e. split across separate files), which I subsequently loaded into R. I believe others here have done the same from within a GIS as well.
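The sampling step boils down to mapping each lon/lat to a row/column in the raster grid. A minimal sketch of that index arithmetic (plain Python, no GDAL dependency; the six-element geotransform follows GDAL's convention, and the tile origin and cell size here are hypothetical GTOPO30-like values):

```python
def lonlat_to_pixel(gt, lon, lat):
    """Convert lon/lat to (col, row) using a GDAL-style geotransform
    gt = (origin_x, pixel_width, 0, origin_y, 0, -pixel_height),
    assuming a north-up raster with no rotation terms."""
    col = int((lon - gt[0]) / gt[1])
    row = int((lat - gt[3]) / gt[5])
    return col, row

# Hypothetical tile: upper-left corner at (-180, 90), 30-arc-second cells.
cell = 30.0 / 3600.0
gt = (-180.0, cell, 0.0, 90.0, 0.0, -cell)

col, row = lonlat_to_pixel(gt, -179.5, 89.5)
print(col, row)  # 60 60: half a degree = 60 cells of 30 arc-seconds
```

With the real bindings you would then read that single cell, e.g. via the band's ReadAsArray(col, row, 1, 1); the arithmetic above is the part that stays the same regardless of which GDAL interface you use.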
3. Do I need to break the DEMs and sample dataset into smaller tiles to avoid the memory issues? If so how do I do that in R / GDAL?
GTOPO30 comes in a few dozen tiles, ranging in size from 5 to 25 MB zipped. On modern hardware that isn't really big, and R can probably load it all without a problem if you have enough RAM.

My personal preference would be to extract the elevations I wanted with home-brewed Python, write them to a comma-delimited file, and read that into R - I have no problem on my two-year-old PC loading 64,000 records of xyz data, or indeed many times that number. There are surely simpler solutions that wouldn't involve much, if any, coding, but that is the approach I have direct experience with.

Good luck!

Ashton
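That extract-then-import workflow might look roughly like this (a sketch: the sample_elevation function is a made-up placeholder standing in for the real DEM lookup, and the output goes to an in-memory buffer so the snippet is self-contained; swap in a real file handle in practice):

```python
import csv
import io

def sample_elevation(lon, lat):
    # Placeholder for the real DEM lookup (e.g. via PyGDAL);
    # returns a dummy value so the sketch runs on its own.
    return 0.0

# Write one lon,lat,elev row per 1x1-degree grid point.
buf = io.StringIO()  # replace with open("elev.csv", "w", newline="")
writer = csv.writer(buf)
writer.writerow(["lon", "lat", "elev"])
for lat in range(-90, 90):
    for lon in range(-180, 180):
        writer.writerow([lon, lat, sample_elevation(lon, lat)])

n_rows = buf.getvalue().count("\n") - 1  # data rows, excluding the header
print(n_rows)  # 64800
```

On the R side, something like read.csv("elev.csv") gets you a data frame ready to merge with your sample dataset for kriging.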
Ashton Shortridge
Associate Professor, Dept of Geography
Michigan State University, 235 Geography Building
ashton at msu.edu
http://www.msu.edu/~ashton
ph (517) 432-3561 / fx (517) 432-1671