memory usage grows too fast

1 message · Ping-Hsun Hsieh

Thanks to Peter, William, and Hadley for their help.
Your code is much more concise than mine.  :P
 
William's and Hadley's suggestions are the same. Here is their code.

	f <- function(dataMatrix) rowMeans(dataMatrix == "02")

And Peter's code is the following.

	apply(yourMatrix, 1, function(x) length(x[x==yourPattern]))/ncol(yourMatrix)


In terms of running time, the first one ran faster than the latter on my dataset (2.5 mins vs. 6.4 mins).
The memory consumption of the first one, however, was much higher than that of the latter (>8G vs. ~3G).

Any thoughts? My guess is that rowMeans created extra copies to perform its calculation, but I am not so sure.
I am also interested in understanding ways to handle memory issues in general. Hope someone could shed light on this for me. :)
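For what it's worth, a likely culprit is the comparison itself: `dataMatrix == "02"` materializes a full logical matrix the same shape as the input (4 bytes per cell in R) before `rowMeans` ever runs, so peak memory is the character matrix plus a complete logical copy. A minimal sketch of one common workaround, assuming a hypothetical helper name `chunkedRowMeans` and a block size chosen by the user, is to process the rows in blocks so only a small temporary is alive at any time:

```r
## Sketch (not the posted solution): per-row proportion of entries equal
## to `pattern`, computed block-by-block so the temporary logical matrix
## from `m[rows, ] == pattern` never exceeds blockSize x ncol(m).
chunkedRowMeans <- function(m, pattern, blockSize = 1000L) {
  n <- nrow(m)
  out <- numeric(n)
  for (s in seq(1L, n, by = blockSize)) {
    e <- min(s + blockSize - 1L, n)
    ## only a small logical matrix is allocated on this line
    out[s:e] <- rowMeans(m[s:e, , drop = FALSE] == pattern)
  }
  out
}

## quick check against the one-shot version on a toy matrix
m <- matrix(sample(c("01", "02"), 20, replace = TRUE), nrow = 4)
stopifnot(all.equal(chunkedRowMeans(m, "02", blockSize = 2L),
                    rowMeans(m == "02")))
```

This trades a little speed (the loop and repeated subsetting) for a bounded peak, which may explain why the row-at-a-time `apply` version above used ~3G while the vectorized one needed >8G.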

Best,
Mike

-----Original Message-----
From: Peter Alspach [mailto:PAlspach at hortresearch.co.nz] 
Sent: Thursday, May 14, 2009 4:47 PM
To: Ping-Hsun Hsieh
Subject: RE: [R] memory usage grows too fast

Tena koe Mike

If I understand you correctly, you should be able to use something like:

	apply(yourMatrix, 1, function(x) length(x[x==yourPattern]))/ncol(yourMatrix)

I see you've divided by nrow(yourMatrix) so perhaps I am missing
something.

HTH ...

Peter Alspach