Too large a data set to be handled by R?
3 messages · tsunhin wong, jim holtman, Stavros Macrakis

Dear R users,

I have been extracting data dynamically from raw files, but that strategy is very slow. To save time, I am planning to generate a data set of size 1500 x 20000, where each data point is a 9-digit decimal number. I know R's vectors are limited to 2^31 - 1 elements, and my data set will not exceed that limit. However, my laptop has only 2 GB of RAM and runs 32-bit Windows (XP or Vista), and I have run into R memory problems before. Please let me know your opinion based on your experience.

Thanks a lot!
- John
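For scale, a quick back-of-the-envelope check in R (a sketch: the figure below is for a single copy of the matrix stored as doubles; R routinely makes temporary copies during computation, so peak usage can be two to three times higher):

    # One copy of a 1500 x 20000 numeric (double) matrix:
    n <- 1500 * 20000        # 30 million elements
    n * 8 / 2^20             # ~228.9 MB at 8 bytes per double

    # On 32-bit Windows, memory.limit() reports (and can raise) the
    # per-process ceiling in MB, typically around 1.5 GB by default:
    # memory.limit()

So a single copy fits comfortably in 2 GB; it is the intermediate copies made during manipulation that usually trigger "cannot allocate vector" errors on a machine like this.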
[The two replies were scrubbed by the mailing-list archive; attachments: <https://stat.ethz.ch/pipermail/r-help/attachments/20090520/4aed0aa8/attachment-0001.pl> and <https://stat.ethz.ch/pipermail/r-help/attachments/20090520/5751fc21/attachment-0001.pl>]