
Message-ID: <fcd289fd0508291500496828e5@mail.gmail.com>
Date: 2005-08-29T22:00:32Z
From: Ferran Carrascosa
Subject: memory

Hi,

I have a matrix of 700,000 x 10,000 cells of floating-point data.
I would like to work with the entire table, but I am running into a lot
of memory problems. I have read ?Memory.
I am working on Windows 2000 with R 2.1.0.

The only solution I have applied so far is:
> memory.limit(size=2048)

But now my problems are:
- I need to work with more than 2 GB. How can I exceed this limit?
- When applying some algorithms, the maximum number of cells in a
single object, about 2*10^9, is reached.
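
For reference, the rough arithmetic behind the two limits above can be
sketched in R itself (the exact per-object cap of 2^31 - 1 elements is
documented in ?Memory-limits):

```r
# Approximate storage for a 700,000 x 10,000 matrix of doubles:
cells <- 700000 * 10000      # 7e9 cells
bytes <- cells * 8           # R stores a double in 8 bytes
bytes / 1024^3               # ~52 GB -- far beyond a 2 GB address space

# Independently of total RAM, one R object cannot hold more than
# 2^31 - 1 (about 2.1e9) elements, which 7e9 cells already exceeds:
cells > 2^31 - 1             # TRUE
```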

Could you please send me some advice/strategies for working with
large amounts of data in R?

Does R have a way to work with lower memory requirements?

Thanks in advance,
-- 
Ferran Carrascosa