Hardware for R: CPU 64 vs 32, dual vs quad
4 messages · Nic Larson, Brian Ripley, Henrik Bengtsson, Eberhard Lisse
On Tue, 9 Sep 2008, Nic Larson wrote:
Need to buy a fast computer for running R on. Today we use a 2.8 GHz Intel D CPU and the calculations take around 15 days. Is it possible to get the same calculations down to minutes/hours by only changing the hardware?
No: you would need to arrange to parallelize the computations. I'd be surprised if you got a computer within your budget that was 3x faster on a single CPU than your current one, and R will only use (unaided) one CPU for most tasks (the exception being some matrix algebra).
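For readers wondering what "parallelize the computations" looks like in practice, here is a minimal sketch. Note an assumption: it uses the 'parallel' package, which ships with R >= 2.14 and thus postdates this 2008 thread; back then the 'snow' and 'multicore' packages played the same role. The slow_task() function is a hypothetical stand-in for one independent unit of the real computation.

```r
## Spreading independent tasks over several cores with parallel::mclapply().
library(parallel)

slow_task <- function(i) {
  sum(sqrt(seq_len(1e5))) + i      # placeholder for one unit of real work
}

## Serial version: R uses a single CPU core.
serial <- lapply(1:8, slow_task)

## Parallel version: mclapply() forks worker processes. Forking is not
## available on Windows (mc.cores must be 1 there; use makeCluster() and
## parLapply() instead). detectCores() reports how many cores are available.
n_cores <- if (.Platform$OS.type == "windows") 1L else 2L
par_res <- mclapply(1:8, slow_task, mc.cores = n_cores)

stopifnot(identical(serial, par_res))  # same results, spread over cores
```

This only helps when the tasks are independent of one another; a computation whose steps each depend on the previous result cannot be split up this way.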
Should I go for a really fast dual 32-bit CPU and run R over Linux or XP, or go for a quad-core / 64-bit CPU? Is it effective to run R on 64 bit (and problem-free (running/installing))?
All answered in the R-admin manual, so please RTFM.
Have around 2000-3000 euro to spend. Thanks for any tip.
______________________________________________ R-help at r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-help PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self), +44 1865 272866 (PA)
1 South Parks Road, Oxford OX1 3TG, UK; Fax: +44 1865 272595
On Tue, Sep 9, 2008 at 6:31 AM, Nic Larson <niklar at gmail.com> wrote:
Need to buy a fast computer for running R on. Today we use a 2.8 GHz Intel D CPU and the calculations take around 15 days. Is it possible to get the same calculations down to minutes/hours by only changing the hardware? Should I go for a really fast dual 32-bit CPU and run R over Linux or XP, or go for a quad-core / 64-bit CPU? Is it effective to run R on 64 bit (and problem-free (running/installing))? Have around 2000-3000 euro to spend
Faster machines won't do that much. Without knowing what methods and algorithms you are running, I bet you a beer that it can be made twice as fast just by optimizing the code. My claim applies recursively; in other words, by optimizing the algorithms/code you can speed things up quite a bit. From experience, it is not unlikely to find bottlenecks in generic algorithms that can be made 10-100 times faster. Here is *one* example illustrating that even when you think the code is "fully optimized" you can still squeeze out more:

http://wiki.r-project.org/rwiki/doku.php?id=tips:programming:code_optim2

So, start profiling your code to narrow down the parts that take most of the CPU time. help(Rprof) is a start. There is also a section 'Profiling R code for speed' in 'Writing R Extensions'. Good old verbose printout of system.time() also helps.

My $.02 ...or 2000-3000USD if it was bounty?! ;)

/Henrik
Thanks for any tip
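The profiling workflow Henrik describes can be sketched as follows. The slow_fun()/fast_fun() pair is a hypothetical example, not from the thread; the vectorized rewrite illustrates the kind of large speed-up he mentions.

```r
## Time two implementations with system.time(), then use Rprof() and
## summaryRprof() to see where a script spends its CPU time.

slow_fun <- function(n) {          # grows the result one element at a time
  x <- numeric(0)
  for (i in 1:n) x <- c(x, i^2)    # each c() copies the whole vector: O(n^2)
  x
}
fast_fun <- function(n) (1:n)^2    # vectorized equivalent: O(n)

t_slow <- system.time(r1 <- slow_fun(2e4))
t_fast <- system.time(r2 <- fast_fun(2e4))
stopifnot(all.equal(r1, r2))       # same answer, very different cost
print(rbind(slow = t_slow, fast = t_fast))

## For a whole analysis, bracket it with Rprof() and inspect the summary;
## $by.self ranks functions by the time spent in their own code.
Rprof("profile.out")
invisible(slow_fun(2e4))
Rprof(NULL)
print(head(summaryRprof("profile.out")$by.self))
unlink("profile.out")
```

Once the profile points at a hot spot, rewriting just that function (vectorizing, preallocating, or calling compiled code) is usually far cheaper than new hardware.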
Nic,

I'd buy a Mac Power server. Not that it's much faster, but while one of the cores toils away at R you can play games on the others :-)-O

el

On 9/9/08 8:54 PM, Henrik Bengtsson wrote:
> [...]
Dr. Eberhard W. Lisse, Obstetrician & Gynaecologist (Saar)
el at lisse.NA, el108-ARIN; Telephone: +264 81 124 6733 (cell)
PO Box 8421, Bachbrecht, Namibia
Please do NOT email to this address if it is DNS related in ANY way