
problems with large data II

If you can't get more memory, you could read portions of the file 
using "scan(..., skip = ..., nlines = ...)" and then compress the data 
somehow to reduce the size of the object you pass to "randomForest".  
You could run "scan" like this in a loop, each pass processing, e.g., 
10% of the data file. 
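A minimal sketch of that loop, assuming a plain numeric text file; the file name, total line count, and 10% chunk size are illustrative (a small file is fabricated here to keep the example self-contained):

```r
## Fabricate a small one-column numeric file standing in for the big data file
datafile <- tempfile(fileext = ".txt")
writeLines(as.character(1:100), datafile)

n_total <- 100   # total lines in the file (known in advance here)
n_chunk <- 10    # read 10% of the file per pass

col_sum <- 0
for (start in seq(0, n_total - n_chunk, by = n_chunk)) {
  ## skip = lines already read, nlines = size of this chunk
  chunk <- scan(datafile, skip = start, nlines = n_chunk, quiet = TRUE)
  ## ... compress/summarise the chunk here before passing it on ...
  col_sum <- col_sum + sum(chunk)
}
col_sum   # 5050, the same answer as reading the whole file at once
unlink(datafile)
```

Each pass holds only one chunk in memory, so peak usage stays at roughly a tenth of the full file.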

      Alternatively, you could pass each portion to "randomForest" and 
compare the results from the several calls to "randomForest".  This would 
produce a type of cross-validation, which might be a wise thing to do 
anyway. 
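A hedged sketch of that idea, fitting one forest per chunk and then merging them with randomForest::combine(); it assumes the randomForest package is installed, and uses iris split into thirds as a stand-in for the chunked data file:

```r
library(randomForest)

set.seed(1)
## Three "chunks" standing in for three passes over the big file
chunks <- split(iris, rep(1:3, length.out = nrow(iris)))

## One forest per chunk
fits <- lapply(chunks, function(d)
  randomForest(Species ~ ., data = d, ntree = 100))

## Compare per-chunk out-of-bag error rates -- a rough cross-validation
sapply(fits, function(f) mean(f$err.rate[, "OOB"]))

## Merge the three 100-tree forests into one 300-tree ensemble
big_fit <- do.call(randomForest::combine, unname(fits))
big_fit$ntree   # 300
```

Comparing the per-chunk OOB errors shows how stable the fit is across portions of the data, while the combined forest pools all the trees for prediction.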

      hope this helps. 
      spencer graves
PaTa PaTaS wrote: