
Help, please

4 messages · Julio César Flores Castro, Brian Ripley, David Winsemius +1 more

On Wed, 18 May 2011, Julio César Flores Castro wrote:

At least millions.
So you are not asking about R but about a contributed package. 
Clearly you have not read the posting guide (you sent HTML), so please 
do so and follow its advice.  That amounts to preparing a documented 
and reproducible example and sending it to the package maintainer in a 
properly signed email which makes clear your affiliation (which 
appears to be a company).

On May 18, 2011, at 6:29 PM, Julio César Flores Castro wrote:

I was able to handle (meaning do Cox proportional hazards work with the
'rms' package, which adds extra memory overhead with a datadist object)
a 5.5-million-row by 100-column data frame without difficulty using
24 GB on a Mac (BSD UNIX kernel). I was running into performance
slowdowns related to paging out to virtual memory at 150 columns, but
after expanding to 32 GB I can now handle 5.5 MM records with 200
columns without paging.
64-bit OS, 64-bit R, and more memory.
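For readers unfamiliar with the rms workflow mentioned above, it looks roughly like this (a minimal sketch; the data frame `big` and its column names are hypothetical, and the rms package, which pulls in survival, must be installed):

```r
library(rms)   # also loads the survival package

## Hypothetical large data frame 'big' with a follow-up time, an event
## indicator, and some predictors (names here are made up).
dd <- datadist(big)        # summarizes the distribution of every column;
options(datadist = "dd")   # this stored summary object is the extra
                           # memory overhead mentioned above

## Cox proportional hazards fit via rms::cph (a wrapper around coxph).
fit <- cph(Surv(time, status) ~ age + sex + trt, data = big,
           x = TRUE, y = TRUE)   # keeping the design matrix and response
                                 # costs still more memory on 5.5 MM rows
```

Dropping `x = TRUE, y = TRUE` (or skipping datadist and using survival::coxph directly) reduces the footprint, at the cost of losing some of rms's downstream summary and plotting conveniences.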
----------------------------------------
The longer-term solution is an implementation and algorithm designed to
increase the coherence of memory accesses (Firefox is doing this to me
right now, dropping every few chars and getting many behind as it
thrashes with a memory leak, LOL).