
slow computation progress for calc function

I have used the slice approach successfully, with complex functions and
multiple outputs, on 12,000+ layers (approx. 3,000 x 3,000 cells) loaded
into chunked NetCDF files on a desktop machine, so this should work.

This was all done using the ncdf4 and raster packages. There is some work
involved in setting up the input/output NetCDF files, though. The trick
was to choose a chunking strategy that minimises row-wise read times
through the time series, then extract each row's slice into a matrix
with the ncdf4 package and run apply() with your custom functions.
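A minimal sketch of that row-wise slicing, assuming a NetCDF file laid out as lon x lat x time; the file name, variable name ("my_var"), and the run-length function inside apply() are all placeholders for your own setup:

```r
library(ncdf4)

nc <- nc_open("input.nc")          # chunked NetCDF, dims: lon x lat x time
nlon  <- nc$dim$lon$len
nlat  <- nc$dim$lat$len
ntime <- nc$dim$time$len

out <- matrix(NA_real_, nrow = nlon, ncol = nlat)

for (row in seq_len(nlat)) {
  # Read one full latitude row across all time steps: an nlon x ntime matrix
  slice <- ncvar_get(nc, "my_var",
                     start = c(1, row, 1),
                     count = c(nlon, 1, ntime))
  # Apply the custom function along each cell's time series;
  # here, longest run of positive values, as an example
  out[, row] <- apply(slice, 1, function(ts) max(rle(ts > 0)$lengths))
}

nc_close(nc)
```

With chunks aligned to whole rows, each ncvar_get() call is one mostly contiguous read, which is where the speed comes from.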

The majority of the overhead will be in read/write if you're using rle with
one output. I suspect clusterR / calc will be a lot faster on a chunked
NetCDF as well... I've seen some huge speed improvements before, but that
was a special case with fewer layers and more computationally expensive
functions.
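For reference, a hedged sketch of the clusterR / calc route over a NetCDF-backed brick (file name, variable name, worker count, and the per-cell function are all assumptions):

```r
library(raster)

# Open the NetCDF as a multi-layer brick without loading it into memory
b <- brick("input.nc", varname = "my_var")

beginCluster(4)  # start 4 worker processes
# calc() applies the function across layers (the time series) of each cell;
# clusterR() splits the raster into chunks and farms them out to the workers
res <- clusterR(b, calc, args = list(fun = function(x) sum(x > 0)))
endCluster()
```

Whether this beats the hand-rolled ncdf4 loop depends on how expensive the per-cell function is relative to the I/O.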

In any case, stacking 8,000 separate rasters is going to be very slow to
process in R unless you use something like NetCDF.
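If the data currently live as thousands of single-layer files, a one-off conversion into a single NetCDF is a sketch like the following (directory, pattern, and variable name are hypothetical):

```r
library(raster)

# Collect the individual layer files and stack them (lazily)
files <- list.files("rasters/", pattern = "\\.tif$", full.names = TRUE)
s <- stack(files)

# Write the whole stack out once as a NetCDF file
writeRaster(s, "combined.nc", format = "CDF",
            varname = "my_var", overwrite = TRUE)
```

The conversion itself is slow, but it is paid once; all subsequent passes over the time series then read from one chunked file.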
On Tue., 25 Jun. 2019, 9:45 pm Roger Bivand, <Roger.Bivand at nhh.no> wrote: