Message-ID: <3d3mvvo8b3fkece7lt5crg08osleem2jn3@4ax.com>
Date: 2004-01-06T19:43:22Z
From: Duncan Murdoch
Subject: Problem reading large tables
In-Reply-To: <20040106190343.GB21560@lysine.umiacs.umd.edu>

On Tue, 6 Jan 2004 14:03:47 -0500, Daniel Sumers Myers
<dmyers at umiacs.umd.edu> wrote:

>Hi, 
>	I'm trying to read in a fairly large (92 observations by 3680 variables)
>table into R from a space-delimited text file (attached) using the command: d8
><- read.table('d8.r', header=T). The function call runs to completion, and I
>get back a valid table object. However, starting at column 999, the table
>records the value TRUE when it should record T (T's in columns 998 and earlier
>are fine). I've looked at the data file, and I can see no difference between
>(e.g.) the T at position 998 in row 1 and the T in position 999 in row 1, yet
>998 is recorded as T and 999 as TRUE. 

The special-looking value 999 is probably just a coincidence.  Most
likely, column 999 was simply the first column that looked to the
type.convert function like a purely logical column (perhaps because
every value in it is T?).  You can tell R not to convert values
automatically by using the colClasses argument to read.table; e.g.,
colClasses = "character" forces every column to stay character.
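
A small sketch of what's going on, using a made-up two-column file
(the file name and column names here are just for illustration):

```r
# A column whose values are all "T" gets converted to logical by default
writeLines(c("a b", "T x", "T y"), "tmp.txt")

d <- read.table("tmp.txt", header = TRUE)
class(d$a)    # "logical" -- the T's print as TRUE

# colClasses = "character" suppresses the conversion entirely
d2 <- read.table("tmp.txt", header = TRUE, colClasses = "character")
class(d2$a)   # "character" -- the T's stay as the literal text "T"
```

A mixed column like b (x, y) is never mistaken for logical, which is
why only the all-T columns are affected.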

Duncan Murdoch

P.S. You can't send attachments to the mailing list, so I didn't see
your data file.