Martin M has suggested I widen this discussion to R-devel, and
I agree that we should increase the defaults, but I'm not at all sure about the amount. The default could even depend on the architecture (via "./configure").
Views, please.

------------- Begin Forwarded Message -------------

Is it not time we increased the defaults a bit? As the base gets bigger I hit 200k cons cells rather frequently. And 2Mb of heap seems low compared to 3Mb of cons cells and 1.8Mb for the R binary.

How little memory do people have these days? Except possibly on Windows and in old teaching labs, I would have thought a 15Mb default for R was very reasonable, which is about

    --vsize 6Mb --nsize 300k

On Solaris that gives:

      PID USERNAME THR PRI NICE  SIZE   RES STATE   TIME   CPU COMMAND
     9308 ripley     1 -25    0   15M 8360K sleep   0:02 4.35% R.binary

as against the default:

      PID USERNAME THR PRI NICE  SIZE   RES STATE   TIME   CPU COMMAND
     9309 ripley     1 -25    0 9664K 6416K sleep   0:02 4.66% R.binary

and that extra user memory makes a lot of difference.

------------- End Forwarded Message -------------
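For anyone wanting to compare, here is a rough back-of-envelope sketch of where the ~15Mb figure comes from. The 16 bytes per cons cell is my assumption for a 32-bit machine (the text's 3Mb for the old cons-cell arena is consistent with roughly that size); the launch line at the end just shows the proposed options, it is not run here.

```shell
#!/bin/sh
# Estimate the memory taken by the proposed defaults.
NSIZE=300000        # proposed --nsize: number of cons cells
CELL_BYTES=16       # ASSUMED size of one cons cell on a 32-bit box
VSIZE_MB=6          # proposed --vsize in Mb

cons_mb=$(( NSIZE * CELL_BYTES / 1048576 ))
echo "cons cells: ~${cons_mb}Mb, heap: ${VSIZE_MB}Mb"
# Add ~1.8Mb for the R binary itself and you land in the
# neighbourhood of the 15M SIZE reported by top above.

# To actually start R with these settings:
#   R --vsize 6Mb --nsize 300k
```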
Brian D. Ripley,                  ripley@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax: +44 1865 272595

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-devel mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe" (in the "body", not the subject!)
To: r-devel-request@stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._