
RMySQL query: why does the result take so much memory in R?

3 messages · Duncan Murdoch, Christoph Lehmann, Christian Schulz

#
Christoph Lehmann wrote:
Those fields are each 8 or 20 bytes in size, so you're talking 12e6 
times roughly 36 bytes, or nearly half a gigabyte for each copy.  Presumably 
the code is storing more than one or two copies of the data.
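A rough back-of-the-envelope check of that figure (the byte counts are approximations: an R numeric takes 8 bytes, and a 15-character string roughly 20 bytes plus object overhead):

```r
# Approximate per-copy memory for 12e6 rows of
# (numeric id, char(15) group, numeric measurement):
# about 8 + 20 + 8 = 36 bytes per row.
rows  <- 12e6
bytes <- rows * (8 + 20 + 8)
bytes / 2^30   # roughly 0.4 GiB per copy of the data
```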

Why don't you use fetch() to get your records in more manageable chunks?
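A minimal sketch of chunked fetching with dbSendQuery() and fetch() (table and column names taken from the original post; the chunk size of 1e5 rows is an arbitrary choice, and `group` is backquoted because it is a MySQL reserved word):

```r
library(RMySQL)

drv <- dbDriver("MySQL")
ch  <- dbConnect(drv, dbname = "testdb", user = "root", password = "mysql")

# Send the query without materializing the full result set in R
res <- dbSendQuery(ch,
    "select id, `group`, measurement from mydata")

# Fetch and process 1e5 rows at a time instead of all 12e6 at once
while (!dbHasCompleted(res)) {
    chunk <- fetch(res, n = 1e5)
    ## ... process / aggregate / write out 'chunk' here ...
}

dbClearResult(res)
dbDisconnect(ch)
dbUnloadDriver(drv)
```

This way at most one chunk of the data is held in R at any time, instead of the whole table plus intermediate copies.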

Duncan Murdoch
#
Hi
I just started with RMySQL. I have a database with roughly 12 million 
rows/records and 8 columns/fields.

From all 12 million records I want to import only 3 fields.
The fields are specified as: id int(11), group char(15), measurement 
float(4,2).
Why does this take > 1 GB of RAM? I run R on SUSE Linux with 1 GB of RAM, 
and with the code below it even fills the whole 1 GB of swap. I just don't 
understand how 12e6 * 3 values can fill such a huge amount of RAM. Thanks 
for any clarification and potential solutions.


## my code
library(RMySQL)
drv <- dbDriver("MySQL")
ch <- dbConnect(drv, dbname = "testdb",
                user = "root", password = "mysql")
## note: `group` is a reserved word in MySQL and must be backquoted
testdb <- dbGetQuery(ch,
        "select id, `group`, measurement from mydata")
dbDisconnect(ch)
dbUnloadDriver(drv)

## end of my code

Cheers
Christoph
#
Hi,

IMHO, when all your columns are numeric, you only need about
rows /100/100/8 MB:
 >>(12000000*3)/100/100/8
[1] 450

But one of your columns, group, is char, which takes more.
In the past I suffered through a lot of problems with massive data and R,
and learned to do as much as possible in the database, or else you have to
upgrade your computer to 2-4 GB like a database machine!?
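For example, if the eventual goal is a per-group summary, the aggregation can be pushed into MySQL so that only a handful of rows ever reach R (a sketch assuming the table and connection details from the original post):

```r
library(RMySQL)

drv <- dbDriver("MySQL")
ch  <- dbConnect(drv, dbname = "testdb", user = "root", password = "mysql")

# Let the database do the heavy lifting; R only sees the summary,
# one row per group instead of 12e6 raw records.
means <- dbGetQuery(ch,
    "select `group`,
            avg(measurement) as mean_measurement,
            count(*) as n
     from mydata
     group by `group`")

dbDisconnect(ch)
dbUnloadDriver(drv)
```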

regards, christian


Christoph Lehmann wrote: