
Is R more heavy on memory or processor?

I agree with Dan, memory will often be the limiting
factor.  I added RAM (16GB total) to my PPC Mac and have
had a much more productive environment, both for
32-bit and 64-bit applications.

Even if a single R session cannot benefit from multiple
cores, you can break your processes into parallel
pieces and use your separate CPUs with cluster
software, or just run multiple R jobs manually.

I'd recommend maximizing your RAM quantity over
RAM speed.  Also, weigh the actual speed gain:
gains of 10-fold or more are noticeable, while
gains of 2- to 3-fold rarely make much of a
difference.

Steven McKinney, Ph.D.

Statistician
Molecular Oncology and Breast Cancer Program
British Columbia Cancer Research Centre

email: smckinney +at+ bccrc +dot+ ca

tel: 604-675-8000 x7561

BCCRC
Molecular Oncology
675 West 10th Ave, Floor 4
Vancouver B.C. 
V5Z 1L3
Canada

-----Original Message-----
From: r-sig-mac-bounces at stat.math.ethz.ch on behalf of Dan Putler
Sent: Tue 3/24/2009 12:08 PM
To: Booman, M
Cc: R-SIG-Mac
Subject: Re: [R-SIG-Mac] Is R more heavy on memory or processor?
 
Hi Marije,

Personally, I would be more concerned with memory than processor.
Running out of memory can be an unpleasant surprise. Base R uses a
single core, but Simon Urbanek's multicore package (the most recent
version of which, 0.1-3, is dated today) does allow you to use multiple
cores at once. I haven't used this package, so can't offer any personal
experience.

Dan
On Tue, 2009-03-24 at 19:55 +0100, Booman, M wrote: