
R-sig-hpc Digest, Vol 61, Issue 4

Srihari,

mpd is part of mpich (and mvapich/Intel MPI), not OpenMPI, so your application probably isn't running the way you anticipated at all.  This is most likely because your login environment ($PATH, $LD_LIBRARY_PATH, etc.) is not set up to use your new OpenMPI installation.
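A quick way to see which MPI your login shell actually resolves to is to ask the launcher for its version banner (this is just an illustrative check; nothing here is specific to your cluster):

```shell
# Which mpirun does the shell find first on $PATH, if any?
command -v mpirun || echo "mpirun not on PATH"
# OpenMPI's mpirun identifies itself as "Open MPI" in its version output;
# an MPICH-family launcher (mpich/mvapich/Intel MPI) reports itself instead.
mpirun --version 2>/dev/null | head -n 1
```

If the path printed lives under an mpich/mvapich install tree, that confirms the wrong stack is being picked up.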

The errors you sent in the previous post sound like system-specific issues that require information specific to your cluster, so it may be more effective to work with your local cluster admin (if you have one!) to iron out the issues arising from your MPI stack.  I don't think the problem is with R; just take a step back and make sure you have the essentials in place:

1. Your MPI stack needs to be compiled with a specific compiler (it sounds like you have both Intel and GCC available)
2. R needs to be compiled with that same compiler
3. Rmpi needs to be compiled against the MPI stack from #1, with the same compiler as #1 and #2 (this may be non-trivial)
4. All of these need to be accessible from your local environment (either via some "module" command, which is common on most clusters, or by correctly setting $PATH, $LD_LIBRARY_PATH, etc in your .bashrc or .cshrc)
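For point #4, the shell side usually amounts to something like the following .bashrc additions -- the /opt/openmpi prefix below is an assumption for illustration; substitute wherever your OpenMPI was actually installed:

```shell
# Assumed install prefix -- replace with your real OpenMPI location.
OPENMPI_HOME=/opt/openmpi
export PATH="$OPENMPI_HOME/bin:$PATH"
export LD_LIBRARY_PATH="$OPENMPI_HOME/lib:${LD_LIBRARY_PATH:-}"

# On clusters that use environment modules, the equivalent is usually:
#   module load openmpi        # exact module name varies by site

# Sanity checks once the environment is set (harmless if a tool is absent):
command -v mpicc >/dev/null && mpicc --version | head -n 1 \
    || echo "mpicc not on PATH"        # compiler behind the MPI wrappers
command -v R >/dev/null && R CMD config CC \
    || echo "R not on PATH"            # compiler R was built with
```

If the two compilers reported at the end disagree (e.g. icc vs gcc), that mismatch is a likely culprit for points #1-#3. You can also check which MPI library Rmpi's shared object actually links against with something like `ldd .../Rmpi/libs/Rmpi.so | grep -i mpi` (the library path depends on where Rmpi was installed).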

If you are missing (or unsure of) any of these points, your local cluster admin is probably best qualified to sort them out for you.  Good luck!

Glenn
On Oct 22, 2013, at 10:05 AM, Srihari Radhakrishnan <srihari at iastate.edu> wrote: