
multicore with functions calling an exe/.sh file

Thanks, Steve, for your answer.
Stephen Weston wrote:
Yes, I'm running an external program with mclapply, and I thought that 
there might exist some package that manages this automatically, by 
creating a copy of the working process (and all its files) in local 
memory, in some way, but probably it was just too much imagination :)
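For reference, a minimal sketch of what "running an external program with mclapply" can look like: each worker calls the external command via system2(). Here "echo" is a stand-in for the real executable (the actual exe/.sh from the thread would replace it), and the run index is passed as a command-line argument.

```r
library(parallel)

# Each worker invokes an external command; "echo" stands in for the
# real model executable, and the run index is passed as an argument.
run_model <- function(i) {
  system2("echo", args = c("run", as.character(i)), stdout = TRUE)
}

# Fork 4 workers across 2 cores (mc.cores must be 1 on Windows).
results <- mclapply(1:4, run_model, mc.cores = 2)
```

Each element of `results` holds the captured stdout of one external run, so the external program's output can be collected back into R.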


Usually that would be done with a command line
I think this could be the only way of solving my problem. However, I 
also have the additional issue that each process needs to modify the same 
position of the same input files at the same time. So I should also add 
an argument for the location of the input files.

  If you can't do that, you could try executing them
The external program I'm running now doesn't have an option for setting 
the location of the input/output files. So probably the fastest solution 
would be to create as many copies of the input files as the number of 
cores I want to use, and then redirect (in some way) each process to 
a different directory, each one storing all the necessary input files.

I wanted to avoid that solution, but I think it is the only way to 
go forward...
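That per-core-copy workaround can be sketched roughly as follows, assuming the input files live in model.drty and the executable reads/writes in its current working directory. The directory layout and the commented-out "./model.exe" call are hypothetical placeholders, not the actual script.

```r
library(parallel)

model.drty <- "/home/zambrhe/S090-test"  # directory with the input files
n.cores    <- 4

run_in_copy <- function(i) {
  # give each core its own private working directory
  work.drty <- file.path(tempdir(), paste0("core", i))
  dir.create(work.drty, recursive = TRUE, showWarnings = FALSE)

  # copy every input file into this core's directory
  file.copy(list.files(model.drty, full.names = TRUE), work.drty)

  # run the external program from inside the private copy
  old <- setwd(work.drty)
  on.exit(setwd(old))
  # system2("./model.exe")   # the real call would go here
  getwd()                    # returned so we can see where each run happened
}

dirs <- mclapply(seq_len(n.cores), run_in_copy, mc.cores = n.cores)
```

Since each worker changes into its own directory before calling the executable, the runs no longer fight over the same input files.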
Good question.
The directory that holds the input/output/exe/.sh files is:

model.drty <- "/home/zambrhe/S090-test"

and that is the argument I pass to R for executing my script.

However, my '/home/zambrhe/' directory is on a network drive, and 
probably R sees it as "Z:\home\zambrhe\S090-test\file.cio" for some reason...
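One way to check how R is actually resolving that path: file.path() builds paths with forward slashes (which work on every platform R supports), and normalizePath() shows what the path expands to on the current system, e.g. whether a mapped network drive is being substituted.

```r
model.drty <- "/home/zambrhe/S090-test"

# build the full path portably, with forward slashes
cio.file <- file.path(model.drty, "file.cio")

# show what R resolves this path to on the current machine;
# mustWork = FALSE avoids an error if the file is not reachable here
normalizePath(cio.file, winslash = "/", mustWork = FALSE)
```

If normalizePath() prints a "Z:\..." form on the machine in question, the drive mapping (rather than R itself) is what is rewriting the path.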



Thank you very much again.

Mauricio