
Problem with raster package and nc4 file

6 messages · Andrea Timmermann, Michael Sumner, Robert J. Hijmans

#
Hi, 

I have daymet data in nc4 format and could open it with the ncdf4 package. I also have the coordinates of some polygons representing ecological areas. 
Both the polygon vertices and the nc4 file have the same projection (LCC) and units (degrees). What I would like to do is get the average temperature for each ecological area, so I wanted to use the polygons as a mask for the temperature raster and then get the average temperature for each polygon. 

I wanted to create a raster object (using the raster function in the raster package):

ncic <- nc_open("tmin_1980.nc4")
raster(ncic, 'tmin')

But I get this error:
Error in (function (classes, fdef, mtable)  : 
  unable to find an inherited method for function 'raster' for signature '"ncdf4"'

Does anyone know how to fix this? Is there another way to get the data that I need?

I am a bit confused by the fact that the temperature data consists of one matrix of latitude values, one of longitude values, and one of temperature values, and that the spacing within the latitude and longitude matrices is not uniform. Does the raster package somehow account for this? 

Thanks a lot. Andi.
#
On Sat, 6 Jun 2015 at 08:04 Andrea Timmermann <timmermann at gmx.at> wrote:

Do you have grid data in LCC but stored with longitude/latitude
coordinates? This is unfortunate, but you can restore the LCC grid
probably. I would try

r <- raster("tmin_1980.nc4")

and then print it out and tell us the result (or post the errors if it
fails):

r

Do you have the details of the LCC projection? At a bare minimum you need
the centre longitude and centre latitude, the two standard parallels,
and the datum/ellipsoid. Can you open the file with ncdf4 and print it
out like this?

ncic <- nc_open("tmin_1980.nc4")
print(ncic)

You are not supposed to mix use of the ncdf4 package functions with those
of raster; raster is designed to do all the work - but it does expect
sensibly specified regular grids. If your data really are regular in LCC I
would recommend sorting that out. Can you share the file, or a link to an
analogous one?

Once you have it sorted out in raster, you can read your shapefile in and
do this

meanpolydata <- extract(r, poly, fun = mean)

Cheers, Mike.

#
You don't need any hacky workarounds; this is a respectable NetCDF file. I
have an account on that system and found a file called "tmin_1999.nc4" -
this is large, ~1.5 GB. Extracting pixel values with a polygon object is
pretty straightforward, but this is quite a large data set, so:

- test carefully on just a single layer
- check that the automatic transformation of your polygons to the grid by
extract() is sensible (or do it yourself)
- consider subsetting the grid if you don't need all of its extent in
space/time
- extract will be efficient for multilayer grids, but you might consider
using cellnumbers = TRUE if you want to save the index for later extractions
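To illustrate the cellnumbers = TRUE idea, here is a toy sketch on a made-up 10 x 10 raster and a made-up square polygon (not the Daymet data); the point is that the cell index returned by the first extraction can be reused for later layers without repeating the polygon overlay:

```r
library(raster)

## a tiny 10 x 10 grid with known values (default extent is [0,1] x [0,1])
r1 <- raster(matrix(1:100, nrow = 10, ncol = 10))

## one small square polygon inside the grid
p <- spPolygons(cbind(c(0.1, 0.4, 0.4, 0.1), c(0.1, 0.1, 0.4, 0.4)))

## first extraction: keep the cell index alongside the values
ex <- extract(r1, p, cellnumbers = TRUE)[[1]]
cells <- ex[, "cell"]

## later extractions (e.g. other layers/days) can reuse the index directly
mean(r1[cells])
```

For a 365-layer Daymet stack this saves recomputing the polygon/grid overlay for every layer.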

HTH

Cheers, Mike.

## this requires registration at
## http://daac.ornl.gov/DAYMET/guides/Daymet_mosaics.html
## f <- "ftp://ftp.daac.ornl.gov/data/daymet/Daymet_mosaics/data/tmin_1999.nc4"
## download.file(f, basename(f), mode = "wb")
## (could also use OPeNDAP to read this if you have rgdal built with that)

## all the information is here:
## http://daac.ornl.gov/DAYMET/guides/Daymet_mosaics.html

## raster already knows:
r <- stack(basename(f))

class       : RasterStack
dimensions  : 4823, 5268, 25407564, 365  (nrow, ncol, ncell, nlayers)
resolution  : 1000, 1000  (x, y)
extent      : -2015000, 3253000, -3037000, 1786000  (xmin, xmax, ymin, ymax)
coord. ref. : +proj=lcc +lon_0=-100 +lat_0=42.5 +x_0=0 +y_0=0 +lat_1=25 +lat_2=60 +ellps=WGS84
names       : X1999.01.01, X1999.01.02, X1999.01.03, X1999.01.04, X1999.01.05, X1999.01.06, X1999.01.07, X1999.01.08, X1999.01.09, X1999.01.10, X1999.01.11, X1999.01.12, X1999.01.13, X1999.01.14, X1999.01.15, ...

## check with a map
library(rworldmap)
data(countriesLow)
library(rgdal)
namerica <- subset(countriesLow,
                   SOVEREIGNT %in% c("United States of America", "Mexico", "Canada"))
wm <- spTransform(namerica, CRS(projection(r)))

plot(r[[1]])
plot(wm, add = TRUE)  ## looks fine to me . . .

## this code untested obviously
## read in your shapefile
# poly <- shapefile("poly.shp")
# polyvals <- extract(r, poly, fun = mean, na.rm = TRUE) ## use r[[1]] to test on a single layer; see notes above

## dumb polygon extraction test - this will take some time to run, so try
## small tests first
extract(r[[1]], wm, fun = mean, na.rm = TRUE)
On Sat, 6 Jun 2015 at 08:53 Michael Sumner <mdsumner at gmail.com> wrote:

#
Just another short comment below
On Sat, 6 Jun 2015 at 09:37 Andrea Timmermann <timmermann at gmx.at> wrote:

There's nothing to fix: this is a regular grid in metres, sensibly
specified by an extent in the projected coordinates. You don't need to
work with every cell's coordinates explicitly, and you can always generate
them on the fly as required - in whatever coordinate system you want. But I
don't think you need that at all.
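Generating coordinates on the fly might look like this; a sketch on a made-up LCC grid (not the real Daymet file), assuming raster and rgdal are installed:

```r
library(raster)

## a made-up 5 x 5 grid in an LCC projection, analogous in spirit to Daymet
r2 <- raster(nrows = 5, ncols = 5,
             xmn = -250000, xmx = 250000, ymn = -250000, ymx = 250000,
             crs = "+proj=lcc +lon_0=-100 +lat_0=42.5 +lat_1=25 +lat_2=60 +ellps=WGS84")

## projected cell-centre coordinates, computed on demand (nothing stored per cell)
xy <- coordinates(r2)

## the same centres in longitude/latitude, only when you actually need them
ll <- rgdal::project(xy, projection(r2), inv = TRUE)
head(ll)
```

This is exactly the information the per-cell lon/lat arrays in the file duplicate.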

Storing every coordinate in lon/lat arrays in that file is really just an
unnecessary waste; this is one of those "modelling/GIS" crossover areas
where things just don't quite mesh in a sensible way.

Cheers, Mike.

#
Andrea,

One small addition. To get all layers (one for each day, instead of
the single layer you get with 'raster'), use the brick function:

b <- brick("tmin_1980.nc4")
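On a brick, the per-polygon averaging then gives one row per polygon and one column per day. A toy sketch (made-up 3 x 3 x 3 values standing in for the 365 daily layers; the polygon covers the whole default [0,1] extent):

```r
library(raster)

## toy 3-layer brick: layer means are 5, 14 and 23 by construction
b <- brick(array(1:27, c(3, 3, 3)))

## one polygon covering all cell centres
p <- spPolygons(cbind(c(0.05, 0.95, 0.95, 0.05), c(0.05, 0.05, 0.95, 0.95)))

## one row per polygon, one column per layer
extract(b, p, fun = mean, na.rm = TRUE)
```

With the real file the same call is `extract(b, poly, fun = mean, na.rm = TRUE)`, subject to Mike's notes above about testing on a single layer first.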

Robert
On Fri, Jun 5, 2015 at 4:52 PM, Michael Sumner <mdsumner at gmail.com> wrote: