Big matrix memory problem

6 messages · s104100026, Brian Ripley, PIKAL Petr +3 more

#
Hi All,

I want to read 256 1000x1000 matrices into R. I understand that it is unlikely
that I can do this, but in the hope that somebody can help me I am mailing this
list.

I have tried increasing my memory size (I understand the limit is the minimum
of 1024 MB or the computer's RAM, in my case 512 MB).

Does anyone think this is possible in R? Could it be tried in S-Plus, for
example?

Any help is greatly appreciated.

Niall Fitzgerald
PhD candidate.
#
On Fri, 13 May 2005, s104100026 wrote:

What sort of matrix?  If these are real numbers, that is 2 GB.
It is possible in R, but you will need a 64-bit version of R.  Since you
don't state your OS, my guess is it is Windows or MacOS, for neither of
which we have a 64-bit port as yet.
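As a sanity check (a sketch added here, not part of the original reply), the arithmetic behind that 2 GB figure can be reproduced in R itself, assuming double-precision (8-byte) numerics:

```r
## 256 matrices x 1000 x 1000 doubles x 8 bytes each
bytes <- 256 * 1000 * 1000 * 8
bytes / 2^30                               # roughly 1.9 GiB in total
print(object.size(matrix(0, 1000, 1000)))  # a single matrix alone is about 8 MB
```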
#
Hello,
On 13 May 2005 at 13:38, s104100026 wrote:

Why do you think that?

I easily read files from data logging equipment,
[1] 2808000 items
just with read.table(), in a few moments, on a quite old (4 years) and not
superbly equipped PC (1 GB of memory).

So you can read your matrices sequentially. But if you want to work with all
256 matrices at once, it could be a problem.

Depends on what you want to do with them.
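A sequential-reading loop along these lines might look as follows (a sketch; the file names, and the two tiny stand-in files created at the top, are invented for illustration):

```r
## Create two small stand-in files; the real ones might be matrix001.txt etc.
for (i in 1:2)
  write.table(matrix(rnorm(9), 3, 3), sprintf("matrix%03d.txt", i),
              row.names = FALSE, col.names = FALSE)

files <- sprintf("matrix%03d.txt", 1:2)
col.totals <- 0
for (f in files) {
  m <- as.matrix(read.table(f))          # read one matrix at a time
  col.totals <- col.totals + colSums(m)  # e.g. accumulate column sums
  rm(m)                                  # drop it before reading the next
}
col.totals
```

Only one matrix is ever held in memory, so 256 of them pose no problem as long as the per-matrix result (here, running column totals) is small.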
Cheers
Petr
Petr Pikal
petr.pikal at precheza.cz
#
s104100026 wrote:

Given the matrices are numeric, you will need 256*1000*1000*8 bytes = 2 GB of
memory just to hold them. Moreover, in order to apply calculations, objects
are frequently duplicated, so the actual requirement is even higher.
So you should really handle those matrices separately, either by getting
them from a database or by saving them in the form of separate .Rdata files.
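A minimal sketch of the separate-.Rdata approach, with tiny stand-in matrices and hypothetical file names:

```r
## Save each matrix to its own file, then load them back one at a time
for (i in 1:2) {                     # 256 in the real case
  m <- matrix(rnorm(4), 2, 2)        # stand-in for a real 1000 x 1000 matrix
  save(m, file = sprintf("matrix%03d.Rdata", i))
  rm(m)
}
load("matrix001.Rdata")              # restores the object `m` from disk
dim(m)
```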

Uwe Ligges
#
On 5/13/05, s104100026 <n.d.fitzgerald at mars.ucc.ie> wrote:
If they are sparse, you could try the SparseM package.
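A minimal sketch of what that might look like, assuming SparseM is installed; as.matrix.csr() converts a dense matrix to compressed sparse row form, storing only the nonzero entries:

```r
library(SparseM)
m <- matrix(0, 1000, 1000)
m[sample(length(m), 5000)] <- rnorm(5000)  # about 0.5% of entries nonzero
sm <- as.matrix.csr(m)                     # compressed sparse row storage
print(object.size(sm))                     # far smaller than the ~8 MB dense form
```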
#
S-Plus 7 advertises facilities for large data sets 
(http://www.insightful.com/products/splus/default.asp#largedata).  Their 
web site says they do this with "New Pipeline Architecture" that 
"streams large data sets through available RAM instead of reading the 
entire data set into memory at once."  It also "includes a new data type 
for dealing with very large data objects".  If you want more than this, 
I suggest you post to "S-News List <s-news at lists.biostat.wustl.edu>";  I 
haven't used it.

	  hope this helps.
	  spencer graves
Gabor Grothendieck wrote: