Hi all,
I have a 3.1 GB dataset (11 columns, with lots of integer and
string data).
Reading it with read.table takes very long, and it seems my RAM is not
big enough: I have 3.2 GB of RAM and 7 GB of swap, on 64-bit Ubuntu.
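For reference, this is roughly the call I am using (the file name and separator below are just placeholders, not my actual values):

```r
# Roughly what I'm running; "mydata.txt" and the tab separator
# are placeholders standing in for my real file and format.
dat <- read.table("mydata.txt",
                  header = TRUE,
                  sep = "\t",
                  stringsAsFactors = FALSE)
```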
Is there a good solution for reading large data into R? I have seen
people suggest the bigmemory and ff packages, but they seem very
complicated, and I don't know how to get started with them.
I have tried bigmemory, but I got some errors and gave up.
Can someone give me a simple example of how to use ff or bigmemory, or
suggest a better solution?
Thank you in advance,
Edwin