Memory problems, HDF5 library and R-1.2.2 garbage collection
On 22 Mar 2001, Marcus G. Daniels wrote:
"NEN" == Norberto Eiji Nawa <eiji at isd.atr.co.jp> writes:
NEN> When I try to load a single 50MB HDF5 file, the computer chokes
NEN> before completing the job as well.

I'll check this out and make sure there isn't gratuitous waste happening. The problems with the big file sound plausible, but the smaller chunks should be manageable. Thanks for the test cases, by the way.
I'm using R 1.2.2 to read in large netCDF files. I've read in about a tenth of several 220MB netCDF files, and I've processed 82 of these files in a row without restarting R and without any memory problems. So R clearly can read in large datasets; it must be a problem with the HDF module. I know that the netCDF 1.2 library is very inefficient at some things, and I've pretty much totally rewritten it.

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe" (in the "body", not the subject!)
To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
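[A sketch of the slice-at-a-time reading described above. This uses the ncdf4 package, which postdates this thread (the poster was using the then-current netCDF 1.2 interface), and the file name, variable name, and dimension layout are all hypothetical; it only illustrates the general technique of reading a subset of a large variable via start/count rather than loading the whole file.]

```r
# Sketch: read a large netCDF variable in slices instead of all at once,
# so memory use stays bounded regardless of file size.
# Assumes a hypothetical file with a variable "temperature" laid out as
# (lon, lat, time). Requires the ncdf4 package (a modern interface).
library(ncdf4)

nc <- nc_open("big_model_output.nc")   # hypothetical ~220MB file
nt <- nc$dim$time$len                  # length of the time dimension

for (t0 in seq(1, nt, by = 10)) {
  nslice <- min(10, nt - t0 + 1)
  # Read only 10 time steps at a time: all of lon/lat (count = -1),
  # a slice of time starting at index t0.
  chunk <- ncvar_get(nc, "temperature",
                     start = c(1, 1, t0),
                     count = c(-1, -1, nslice))
  # ... process chunk; it is garbage-collected before the next iteration ...
}
nc_close(nc)
```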