Hi All,
I have a txt file to read into R; it is about 500 MB. The file was
produced by calling write.table(M, file = "xxx.txt"), where M is a
large matrix. After running MM = read.table("xxx.txt"), the R GUI
keeps one CPU core/thread fully occupied indefinitely.
64-bit R with 16 GB of RAM on 64-bit Windows 7 with an i5 CPU should
be capable of this, so if anyone knows the reason, that would be
appreciated.
Thank you for any advice.
Best wishes,
Jie
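[Editor's note: the workflow described above can be reproduced at a small scale; this stand-in for the 500 MB file uses the object names from the post.]

```r
# Small-scale reproduction of the workflow in the post above:
# write a matrix with write.table(), read it back with read.table().
M <- matrix(rnorm(1e4), nrow = 100)   # stand-in for the large matrix
write.table(M, file = "xxx.txt")      # default: quoted header + row names

# read.table() re-detects the type of every field on every row,
# which is what makes it slow on a 500 MB file.
MM <- read.table("xxx.txt")
```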
read.table freezes the computer
3 messages · Jie, Ben Bolker, S Ellison
Jie <jimmycloud <at> gmail.com> writes: [snip]
> I have a txt file to read into R. The size of it is about 500MB.
> This txt file is produced by calling write.table(M, file =
> "xxx.txt"), where M is a large matrix.
> After running MM = read.table("xxx.txt"), the R gui keeps a cpu
> core/thread fully occupied forever.
> 64 bit R with 16GB RAM on Win7 64, i5 cpu should be capable.
[snip] Take a look at http://stackoverflow.com/questions/1727772/quickly-reading-very-large-tables-as-dataframes-in-r
-----Original Message-----
> I have a txt file to read into R. The size of it is about 500MB.
> This txt file is produced by calling write.table(M, file =
> "xxx.txt"), where M is a large matrix. After running MM =
> read.table("xxx.txt"), the R gui keeps a cpu core/thread
> fully occupied forever.
> 64 bit R with 16GB RAM on Win7 64, i5 cpu should be capable.
> So if anyone knows the reason, that will be appreciated.
> Thank you for any advice.
A look at the 'Memory usage' section of ?read.table may help.
In particular, specifying colClasses as "numeric" is recommended.
S Ellison
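[Editor's note: the 'Memory usage' advice above can be sketched as follows. Pre-declaring column types lets read.table skip per-field type detection; the column count here is an assumption for illustration, and note that colClasses counts the row-name column that write.table adds by default.]

```r
# Reproduce the original setup on a small scale.
M <- matrix(rnorm(20), nrow = 4)
write.table(M, file = "xxx.txt")

ncols <- 5   # number of data columns in M (known to whoever wrote the file)

# write.table() prepends a row-name column, so the first column is
# character; the rest are declared numeric up front. Supplying nrows,
# if known, also helps read.table manage memory.
MM <- read.table("xxx.txt",
                 colClasses = c("character", rep("numeric", ncols)),
                 nrows = 4)
M2 <- as.matrix(MM)
```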