large data set issues
2 messages · clayton.springer@pharma.novartis.com, Thomas Lumley

Dear R-help,

I am loading data sets of size 509 x ~9000 integers, which is a 10 MB file on disk; the .Rdata file is 18 MB. When I try to use rpart I get:

Error: protect(): stack overflow

Naturally, the same thing happens with randomForest too. What can be done, or is this data set really too big?

thanks in advance,
Clayton
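As a sanity check on the sizes quoted in the message, a rough back-of-the-envelope calculation in R (integers are 4 bytes each; this ignores R object overhead):

```r
# 509 rows x ~9000 columns of 4-byte integers:
509 * 9000 * 4          # 18324000 bytes
509 * 9000 * 4 / 2^20   # ~17.5 MB -- consistent with the 18M .Rdata file
```

So the data itself is not unreasonably large; the error is about the pointer protection stack, not total memory.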
On Wed, 25 Sep 2002 clayton.springer at pharma.novartis.com wrote:
Dear R-help, I am loading data sets of size 509 x ~9000 integers, which is a 10 MB file on disk; the .Rdata file is 18 MB. When I try to use rpart I get: Error: protect(): stack overflow. Naturally, the same thing happens with randomForest too. What can be done, or is this data set really too big?
You could try redefining R_PPSSIZE in Defn.h and recompiling, to get a larger pointer protection stack.

	-thomas

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.- r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html Send "info", "help", or "[un]subscribe" (in the "body", not the subject !) To: r-help-request at stat.math.ethz.ch _._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
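For reference, the change Thomas suggests is a one-line edit to the R source tree before rebuilding. The file location and default value vary by R version, so this is only a sketch, assuming the definition lives in src/include/Defn.h; the value chosen below is illustrative, not a recommendation:

```c
/* src/include/Defn.h -- enlarge the pointer protection stack.
 * R_PPSSIZE bounds how many SEXPs can be PROTECT()ed at once;
 * rpart's recursive partitioning can exhaust the default on wide
 * data sets like 509 x ~9000. */
#define R_PPSSIZE 100000L   /* illustrative value; default is smaller */
```

After editing, reconfigure and rebuild R (`./configure && make`) as usual.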