How to use all the cores while running glmer on a piecewise exponential survival with
That's the plan; the real question is how big the samples should be. The faster we can estimate the model, the bigger the sample can be, so if I can run the model on multiple cores that would significantly increase the feasible sample size.
On Thu, Aug 23, 2018 at 12:23 PM Doran, Harold <HDoran at air.org> wrote:
One idea, though, is you can take samples from your very large data set
and estimate models on the samples very quickly.
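The suggestion above could be sketched as follows. This is a hypothetical illustration, not code from the thread: the column names (`id`, `event`, `interval`, `exposure`) and the model formula are assumptions. The key point is to sample *individuals*, not rows, so that each sampled subject keeps all of their recurrent-event records; the piecewise exponential model is fit as a Poisson GLMM with a log-exposure offset.

```r
## Hypothetical sketch: fit the model on a random subsample of individuals.
## Assumes data in counting-process form, one row per individual-interval.
library(lme4)

fit_on_sample <- function(dat, n_ids) {
  ids <- sample(unique(dat$id), n_ids)
  sub <- dat[dat$id %in% ids, ]   # keep ALL rows for each sampled individual
  glmer(event ~ interval + (1 | id),   # individual random intercept
        data   = sub,
        family = poisson,
        offset = log(sub$exposure))   # piecewise exponential via Poisson trick
}
```

Fitting on several independent subsamples and comparing the estimates also gives a rough check on how much precision is lost relative to the full data.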
-----Original Message-----
From: R-sig-mixed-models <r-sig-mixed-models-bounces at r-project.org> On
Behalf Of Adam Mills-Campisi
Sent: Thursday, August 23, 2018 3:18 PM
To: r-sig-mixed-models at r-project.org
Subject: [R-sig-ME] How to use all the cores while running glmer on a
piecewise exponential survival with
I am estimating a piecewise exponential, mixed-effects survival model
with recurrent events. Each individual in the dataset gets an individual
intercept (we're using a PWP approach). Our full dataset has 10 million
individuals with 180 million events. I am not sure any framework can
accommodate data at that size, so we are going to sample. Our final
sample size largely depends on how quickly we can estimate the model,
which brings me to my question: is there a way to multi-thread/core the
model? I tried to find some kind of instruction on the web, and the best
lead I could find was a reference to this mailing list. Any help would be
greatly appreciated.
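For what it's worth, `lme4::glmer` does not itself run across multiple cores (beyond whatever multithreaded BLAS your R is linked against), but independent fits on different subsamples are embarrassingly parallel. A hedged sketch using the base `parallel` package is below; `fit_one()` and the formula are illustrative assumptions, and `mclapply` forks, so on Windows you would need a PSOCK cluster with `parLapply` instead.

```r
## Hypothetical sketch: fit glmer on several subsamples in parallel.
## Column names and formula are assumptions, not from the thread.
library(parallel)

fit_one <- function(seed, dat, n_ids) {
  set.seed(seed)
  ids <- sample(unique(dat$id), n_ids)
  lme4::glmer(event ~ interval + (1 | id),
              data   = dat[dat$id %in% ids, ],
              family = poisson)
}

## One independent fit per core; fork-based, so Unix/macOS only.
fits <- mclapply(1:8, fit_one,
                 dat = full_data, n_ids = 5e4,
                 mc.cores = max(1, detectCores() - 1))
```

Averaging fixed-effect estimates across the parallel fits (or pooling them formally) is one way to use all the cores without needing a single fit on the full 180-million-row dataset.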
_______________________________________________
R-sig-mixed-models at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-sig-mixed-models