
[R-sig-dyn-mod] Parallel deSolve on AWS

1 message · John Harrold

Hey Folks,

So I'm slowly working my way through this. I've decided to first get
things going using multiple cores on the same computer, work out the
bugs there, and then try to move this over to multiple computers. I'm
using foreach/%dopar% (see the example below). When testing this on
different computers, it seems that the environment gets transferred in
an inconsistent way :). I think I've managed to fix this with respect
to packages using the .packages option. However, I've also got
functions that I wrote myself, and I run the model compiled (as a
shared C library). What I've found that seems to work (more or less
consistently) is the following:

library(parallel)    # detectCores(), makeCluster()
library(doParallel)  # registerDoParallel(), %dopar%

n <- 100
cores <- detectCores()
cl <- makeCluster(cores - 1)
registerDoParallel(cl)
xall_p <- foreach(i = 1:n, .packages = c("deSolve")) %dopar% {
  source("myfunc.r")
  dyn.load(paste("ode_model", .Platform$dynlib.ext, sep = ""))
  x <- run_simulation(args)
  x
}
stopCluster(cl)

You can see that within the loop I'm sourcing myfunc.r and I'm loading
the dynamic C library as well. Now my question :).

Is this the most appropriate way to do this? It seems (and this may be
my lack of understanding of the foreach function) that I'm re-sourcing
the file and re-loading the library on every iteration. Is that
correct?
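For what it's worth, one common way to avoid the per-iteration setup is to initialize each worker once with clusterEvalQ() from the parallel package, which evaluates an expression on every node of the cluster before the loop runs. A sketch under the same assumptions as the example above (the same 'myfunc.r' and compiled 'ode_model' files, and the same n, args, and run_simulation(), which are placeholders from the original post):

```r
library(parallel)    # makeCluster(), detectCores(), clusterEvalQ()
library(doParallel)  # registerDoParallel()
library(foreach)     # foreach(), %dopar%

cl <- makeCluster(detectCores() - 1)
registerDoParallel(cl)

# One-time setup, run once per worker rather than once per iteration:
# load deSolve, source the helper functions, and load the compiled model.
clusterEvalQ(cl, {
  library(deSolve)
  source("myfunc.r")
  dyn.load(paste("ode_model", .Platform$dynlib.ext, sep = ""))
})

# The loop body now only does the actual work; .packages is no longer
# needed because deSolve was attached on each worker above.
xall_p <- foreach(i = 1:n) %dopar% {
  run_simulation(args)
}
stopCluster(cl)
```

Note that clusterEvalQ() ties the setup to this particular cluster object, so it has to be repeated if a new cluster is created, and state loaded this way will not carry over to a multi-machine cluster automatically.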

Thanks
John

On Mon, Mar 27, 2017 at 7:45 AM, Tim Keitt <tkeitt at utexas.edu> wrote: