Hi,
I have a matrix of 700,000 x 10,000 cells of floating-point data.
I would like to work with the entire table, but I run into a lot of
memory problems. I have read the ?memory help page.
I work on Windows 2000 with R 2.1.0.
The only solution I have applied so far is:
memory.limit(size=2048)
But now my problems are:
- I need to work with more than 2 GB. How can I exceed this limit?
- When I apply some algorithms, the maximum number of cells in one
object (approx. 2*10^9) is reached.
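For example, a quick back-of-envelope calculation (my own estimate, assuming the
data are stored as standard R doubles at 8 bytes per cell) shows the scale of
the problem:

```r
# Size of the full matrix if held in memory as a numeric (double) matrix.
# These figures are estimates based on 8 bytes per double.
cells <- 700000 * 10000        # 7e9 cells in total
bytes <- cells * 8             # bytes needed for a double matrix
gb    <- bytes / 1024^3        # roughly 52 GiB

# The cell count alone already exceeds the maximum length
# of a single R object (2^31 - 1 elements).
cells > 2^31 - 1               # TRUE
```

So the whole matrix cannot fit in a single R object, regardless of how much RAM
the machine has.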
Could you please send me some advice/strategies for working with
large amounts of data in R?
Does R have a way to work with lower memory requirements?
Thanks in advance,