read.table segfaults
On Fri, Aug 26, 2011 at 9:41 PM, Scott <ncbi2r at googlemail.com> wrote:
It does look like you've got a memory issue. Perhaps passing as.is=TRUE and/or stringsAsFactors=FALSE as optional arguments to read.table will help. If you don't specify these sorts of things, R has to look through the file and figure out which columns are characters/factors etc., so larger files cause more of a headache for R, I'm guessing. Hopefully someone else can comment further on this. I'd try toggling TRUE/FALSE for as.is and stringsAsFactors.

- Do you have other objects loaded in memory as well? This file by itself might not be the problem; it could be a cumulative issue.
- Have you checked the file structure in any other manner?
- How large (MB/kB) is the file that you're trying to read?
- If you just read in parts of the file, is it okay?

      read.table(filename, header=FALSE, sep="\t", nrows=100)
      read.table(filename, header=FALSE, sep="\t", skip=20000, nrows=100)
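For what it's worth, here is a minimal sketch of reading a large file with the column types declared up front via colClasses, which skips read.table's type-guessing pass; the file name "bigfile.txt" and the assumption of four tab-separated numeric columns are placeholders, not details from the original report.

      ## Hypothetical file name and column layout; substitute the real ones.
      filename <- "bigfile.txt"

      ## Declaring colClasses avoids the pass where read.table inspects the
      ## file to guess column types, and stringsAsFactors=FALSE keeps any
      ## character columns from being converted to factors.
      x <- read.table(filename,
                      header = FALSE,
                      sep = "\t",
                      colClasses = rep("numeric", 4),
                      stringsAsFactors = FALSE,
                      quote = "",
                      comment.char = "")

      ## Reading a small slice first is a cheap sanity check on the structure.
      head_chunk <- read.table(filename, header = FALSE, sep = "\t", nrows = 100)
      str(head_chunk)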
Today, after a night's sleep, there are no segfaults! (The computer also slept; I turned it off.) So what is going on? Maybe I shouldn't bother... but I installed the latest patched version yesterday, immediately tried to read the file with a segfault as a result, turned the machine off and on, and now there are no problems. Do we need to reboot after a new install (note, this is not Windows)?

Göran
Göran Broström