This chunk of code writes a character vector to a NetCDF file. It needs two
dimensions: one big enough to hold the longest character string, and one
the length of the vector.
library(ncdf)   # the legacy ncdf package (not ncdf4)

writeChars <- function(fn){
  # one dimension indexing the strings, one spanning the characters of the longest string
  nameDim <- dim.def.ncdf("NamesDim", "", 1:3)
  charDim <- dim.def.ncdf("CharsDim", "", 1:10)
  # prec="char" so that put.var.ncdf will accept character data
  nameVar <- var.def.ncdf("Names", "", list(charDim, nameDim), "NA", prec="char")
  nc <- create.ncdf(fn, list(nameVar))
  v <- c("red ", "green", "blue ")
  # write each string into its own slice along NamesDim
  for(i in 1:length(v)){
    put.var.ncdf(nc, nameVar, v[i], start=c(1,i), count=c(-1,1))
  }
  close.ncdf(nc)
}
Dumping the NetCDF gives:
data:
CharsDim = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 ;
NamesDim = 1, 2, 3 ;
Names =
"red ",
"green",
"blue " ;
}
Whether this is a good idea or not... Hmmmm....
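For the record, here is a sketch of getting the strings back out, assuming the same
legacy ncdf package and a file written by the function above (the readChars name and
the filename are just illustrative, and I'd expect get.var.ncdf to return the char
variable as one string per entry along NamesDim):

library(ncdf)

readChars <- function(fn){
  nc <- open.ncdf(fn)
  # the char variable comes back as a character vector, one element per name
  nms <- get.var.ncdf(nc, "Names")
  close.ncdf(nc)
  # strip the padding used to fill the fixed-width character dimension
  sub(" +$", "", nms)
}

# e.g.
# writeChars("names.nc")
# readChars("names.nc")   # "red" "green" "blue"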
I've noticed that if you use `writeRaster` with format="CDF" you get the
data in one variable called "value", but if you use format="netCDF"
you get one named variable for each layer. I'm pretty sure these names
could be set and recovered, but currently they seem to be called Band1
etc. I think that's the GDAL driver at work:
"""
This driver supports creation of netCDF file following the CF-1
convention. You may create set of 2D
datasets. Each variable array is named Band1, Band2, ... BandN.
""" http://www.gdal.org/frmt_netcdf.html
Also, this section in writeRaster:
NetCDF files have the following additional, optional,
arguments: 'varname', 'varunit', 'longname', 'xname',
'yname', 'zname', 'zunit'
only seems to apply to format="CDF" and not to "netCDF", which ignores
at least the ones I've tried; the "netCDF" format appears to go through
GDAL.
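To make the difference concrete, a rough sketch with the raster package (file names
and layer names are made up, and the exact output will depend on the raster and GDAL
versions installed):

library(raster)

r <- stack(raster(matrix(runif(100), 10)),
           raster(matrix(runif(100), 10)))
names(r) <- c("red", "green")

# native NetCDF writer: honours varname/longname/etc., but puts all
# layers into a single variable ("value" unless varname is given)
writeRaster(r, "native.nc", format="CDF", varname="mydata",
            longname="example data", overwrite=TRUE)

# GDAL-based writer: one variable per layer, but the variables come out
# as Band1, Band2, ... rather than names(r)
writeRaster(r, "gdal.nc", format="netCDF", overwrite=TRUE)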
Is it worth putting any effort into this? I mean, is it worth *Robert*
putting any effort into this?
Barry
Dear all, I keep getting a memory error (vector too big) when overlaying hundreds of thousands of points with a polygon layer. The computer I am running this on is quite powerful, so the error seems to be internal to R, which is hitting a vector allocation that is too large. Is there any workaround for this operation? My goal is to overlay several points onto a habitat polygon layer and remove all points falling outside that polygon. Any help is much appreciated. Thank you, Francesco
On Mon, 25 Aug 2014, Francesco Tonini wrote:
Dear all, I keep getting a memory error (vector too big) when overlaying hundreds of thousands of points with a polygon layer. The computer I am running this on is quite powerful, thus the error is just internal to R because of reaching a vector allocation too big in size. Is there any workaround to execute this operation? My goal is to overlay several point onto a habitat polygon layer, and remove all points falling outside that polygon.
Please recall that nobody can see over your shoulder. Always state the output of sessionInfo(), the code causing the error, the output of traceback() after the error, and some details of the objects concerned (counts of points, for example). Are you, for example, using over() methods from the sp package? A powerful machine may also be misconfigured, so available RAM and any adjustments you have made to R defaults should be declared. Do read the posting guide! Roger
Any help is much appreciated. Thank you, Francesco
Roger Bivand
Department of Economics, Norwegian School of Economics, Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; fax +47 55 95 91 00
e-mail: Roger.Bivand at nhh.no
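For what it's worth, one common way around this kind of allocation failure is to run
the point-in-polygon overlay in pieces with over() from sp (which Roger mentions), so
that no single call has to build a huge result. A sketch only; the object names (pts,
habitat) and the chunk size are illustrative, and both objects are assumed to share
the same CRS:

library(sp)

keepInside <- function(pts, habitat, chunk=50000L){
  n <- length(pts)
  inside <- logical(n)
  for(start in seq(1L, n, by=chunk)){
    idx <- start:min(start + chunk - 1L, n)
    # over() gives NA for points that fall outside every polygon
    inside[idx] <- !is.na(over(pts[idx, ], geometry(habitat)))
  }
  pts[inside, ]   # keep only the points inside the habitat polygons
}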