R on a supercomputer
In general, R is not written in such a way that data remain in cache. However, R can be linked against optimized BLAS libraries, and these are. So if your version of R is compiled to use an optimized BLAS library appropriate to the machine (e.g., ATLAS or Prof. Goto's BLAS), AND a considerable amount of the computation done in your R program involves basic linear algebra (matrix multiplication, etc.), then you might see a good speedup. -- Tony Plate
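A rough, hands-on way to see whether the point above applies to your workload is to time some large dense linear algebra. The sketch below (my own illustration, not from the original thread) times a 1000x1000 matrix multiply; with a tuned BLAS such as ATLAS or GotoBLAS this is typically several times faster than with R's unoptimized reference BLAS, though absolute timings are entirely machine-dependent.

```r
# Illustrative benchmark: dense matrix multiply, which R dispatches to the
# BLAS routine dgemm. A cache-tuned BLAS should make this markedly faster.
set.seed(1)
n <- 1000
a <- matrix(rnorm(n * n), n, n)
b <- matrix(rnorm(n * n), n, n)

t_mult  <- system.time(c1 <- a %*% b)          # plain matrix product
t_cross <- system.time(c2 <- crossprod(a, b))  # t(a) %*% b, also BLAS-backed
print(t_mult)
print(t_cross)

# Sanity check: both routes compute the expected products.
stopifnot(all.equal(c2, t(a) %*% b))
```

If the elapsed times here dwarf the rest of your bootstrap loop, an optimized BLAS (and the supercomputer's larger caches) could help; if your time is spent in interpreted R code instead, it likely won't.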
Kimpel, Mark William wrote:
I am using R with Bioconductor to perform analyses on large datasets using bootstrap methods. In an attempt to speed up my work, I have inquired about using our local supercomputer and asked the administrator if he thought R would run faster on our parallel network. I received the following reply:

"The second benefit is that the processors have large caches. Briefly, everything is loaded into cache before going into the processor. With large caches, there is less movement of data between memory and cache, and this can save quite a bit of time. Indeed, when programmers optimize code they usually think about how to do things to keep data in cache as long as possible. Whether you would receive any benefit from larger cache depends on how R is written. If it's written such that data remain in cache, the speed-up could be considerable, but I have no way to predict it."

My question is: is R written such that data remain in cache?

Thanks,
Mark W. Kimpel MD
Indiana University School of Medicine