
Hardware for R: CPU 64 vs 32, dual vs quad

4 messages · Nic Larson, Brian Ripley, Henrik Bengtsson +1 more

#
On Tue, 9 Sep 2008, Nic Larson wrote:

            
No: you would need to arrange to parallelize the computations.  I'd be 
surprised if you got a computer within your budget that was 3x faster on a 
single CPU than your current one, and R will only use (unaided) one CPU 
for most tasks (the exception being some matrix algebra).
All answered in the R-admin manual, so please RTFM.
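Ripley's point is that R will, unaided, keep only one core busy, so the work has to be split up explicitly. A minimal sketch of what "arranging to parallelize" looks like today, using the `parallel` package that now ships with R (in 2008 one would have reached for the snow or multicore packages instead); `slow_task` is a hypothetical stand-in for an expensive computation:

```r
## Sketch: parallelizing an embarrassingly parallel loop in R.
## 'parallel' is bundled with R >= 2.14 and supersedes snow/multicore.
library(parallel)

slow_task <- function(i) {           # stand-in for an expensive computation
  sum(sqrt(seq_len(1e5)))
}

n_cores <- max(1L, detectCores() - 1L)

## mclapply() forks workers on Unix-alikes; Windows needs a socket cluster
if (.Platform$OS.type == "unix") {
  res <- mclapply(1:8, slow_task, mc.cores = n_cores)
} else {
  cl <- makeCluster(n_cores)
  res <- parLapply(cl, 1:8, slow_task)
  stopCluster(cl)
}
stopifnot(length(res) == 8)
```

Note that this only pays off when each task is large relative to the overhead of dispatching it to a worker.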

  
    
#
On Tue, Sep 9, 2008 at 6:31 AM, Nic Larson <niklar at gmail.com> wrote:
Faster machines won't do that much.  Without knowing what methods and
algorithms you are running, I bet you a beer that it can be made twice
as fast by just optimizing the code.  My claim applies recursively.
In other words, by optimizing the algorithms/code you can speed up
things quite a bit.  From experience, it is not unlikely to find
bottlenecks in generic algorithms that can be made 10-100 times
faster.  Here is *one* example illustrating that even when you think
the code is "fully optimized" you can still squeeze out more:

  http://wiki.r-project.org/rwiki/doku.php?id=tips:programming:code_optim2

So, start profiling your code to narrow down the parts that take most
of the CPU time.  help(Rprof) is a start.  There is also a section
'Profiling R code for speed' in 'Writing R Extensions'.  Good old
verbose printout of system.time() also helps.
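The workflow Henrik describes can be sketched as follows: time a candidate bottleneck with system.time(), then drill down with Rprof(). The toy `slow_version`/`fast_version` pair below is an invented example (growing a vector inside a loop versus vectorizing), chosen to show the kind of 10-100x win he mentions:

```r
## Sketch: find and fix a bottleneck with system.time() and Rprof().
slow_version <- function(x) {
  out <- numeric(0)
  for (i in seq_along(x)) out <- c(out, x[i]^2)  # re-grows 'out': O(n^2)
  out
}
fast_version <- function(x) x^2                   # vectorized equivalent

x <- runif(2e4)
t_slow <- system.time(r1 <- slow_version(x))["elapsed"]
t_fast <- system.time(r2 <- fast_version(x))["elapsed"]
stopifnot(all.equal(r1, r2))   # same answer, very different cost

## Line-level detail on where the slow version spends its time:
Rprof("prof.out")
invisible(slow_version(x))
Rprof(NULL)
summaryRprof("prof.out")$by.self
```

Here summaryRprof() will typically show c() and the loop itself dominating the self-time, which is the cue to vectorize.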

My $0.02 ... or 2000-3000 USD if it were a bounty?! ;)

/Henrik
#
Nic,

I'd buy a Mac Power server.

Not that it's much faster, but while one of the cores toils away
at R you can play games on the others :-)-O

el

on 9/9/08 8:54 PM Henrik Bengtsson said the following: