simulating data

3 messages · Uwe Ligges, Christian Schulz

#
Christian Schulz wrote:
In principle, yes. If the memory doesn't get too fragmented, it will work, I
think.
Simulating one row after another will certainly result in a huge speed
penalty, but maybe simulating one block of rows after another is the
solution you are looking for.
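Uwe's block-wise suggestion could look roughly like the sketch below. This is only an illustration, not code from the thread: `simulate_block()` is a hypothetical stand-in for whatever per-row simulation is actually needed, and each block is appended to a CSV file on disk so R never has to hold the full data set in memory at once.

```r
# Hypothetical per-block simulation: replace with the real data-generating
# code. Returns a data frame with `n` rows.
simulate_block <- function(n) {
  data.frame(x = rnorm(n), y = runif(n))
}

# Simulate `total` rows in chunks of `block_size`, appending each chunk
# to `file` so peak memory use stays bounded by one block.
simulate_in_blocks <- function(total, block_size, file) {
  done <- 0
  first <- TRUE
  while (done < total) {
    n <- min(block_size, total - done)
    block <- simulate_block(n)
    # Write the header only for the first block, then append.
    write.table(block, file, append = !first, sep = ",",
                col.names = first, row.names = FALSE)
    first <- FALSE
    done <- done + n
  }
  done
}
```

The resulting CSV could then be bulk-loaded into MySQL (e.g. with `LOAD DATA INFILE`) for the speed test mentioned below.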

Uwe Ligges
#
..the speed penalty is not really bad, because
I only need the data once, so it can run overnight!
I'm just starting to experiment with loops
and think that's a good exercise...

..but I need the lot of rows to test
the MySQL speed penalty  :-)

Regards, Christian



----- Original Message -----
From: "Uwe Ligges" <ligges at statistik.uni-dortmund.de>
To: "Christian Schulz" <ozric at web.de>
Cc: <r-help at stat.math.ethz.ch>
Sent: Sunday, May 11, 2003 8:02 PM
Subject: Re: [R] simulating data
simulate.
Memory errors occur above roughly the 1,000,000 mark, but is it
possible to simulate many single rows, one after another,
so that R can recover the memory in between?
klient$VermittlungskriteriumA <- as.factor(klient$VermittlungskriteriumA)
klient$VermittlungskriteriumB <- as.factor(klient$VermittlungskriteriumB)
c("<3Monate", "<6Monate", "<12Monate", "<18Monate", "=>24Monate")