
Handling data with thousands of variables

OK, but what about memory usage? So far I have implemented my
analysis in Python with NumPy arrays, using only 100,000 cases and
10,000 keywords, but the memory required for large arrays and
matrices is massive. In R one option is the bigmemory package,
but it's slow, and if I remember correctly bigmemory matrices are
not supported by other R packages.
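As a rough sketch of the scale involved (not from the original thread): a dense float64 matrix of 100,000 x 10,000 needs about 8 GB, and one NumPy-side workaround is a disk-backed array via `np.memmap`, which only pulls touched pages into RAM. The file path and smaller demo shape below are illustrative assumptions.

```python
import numpy as np
import os
import tempfile

# A dense 100,000 x 10,000 float64 matrix needs ~8 GB of RAM:
cases, keywords = 100_000, 10_000
dense_bytes = cases * keywords * 8  # float64 = 8 bytes per entry
print(dense_bytes / 1e9)  # 8.0 (GB)

# One workaround: a disk-backed memory-mapped array, so only the
# pages actually accessed are loaded into memory. A small shape is
# used here purely for demonstration.
path = os.path.join(tempfile.mkdtemp(), "matrix.dat")
m = np.memmap(path, dtype=np.float32, mode="w+", shape=(1000, 100))
m[0, :] = 1.0   # writes go to the file-backed buffer
m.flush()       # persist pending changes to disk
```

Using float32 instead of float64 (or a sparse format, since a case-by-keyword matrix is typically mostly zeros) would also halve or drastically cut the footprint.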

-Håvard
On Fri, Jul 1, 2011 at 10:02 AM, Han De Vries <handevries at gmail.com> wrote: