
Memory getting eaten up with XML

2 messages · Andrew Gormley

Hi all. I have an issue that I cannot resolve. I am trying to read in lots of data stored in XML files. But after I read a file in, copy the relevant data, and remove the document, the memory is not freed. When I monitor it in the Windows Task Manager, the memory usage just climbs with each iteration until R crashes. I can replicate the problem with this small example:
        library(XML)

        file.name <- "C:\\MyData.xml.gz"
        TEMPP <- xmlParse(file.name)
        xx <- xmlRoot(TEMPP)
        rm(xx)
        rm(TEMPP)
        gc()

Even though I remove the root node xx and the document TEMPP, the memory usage remains the same as it was when I first read the file in... Any ideas/solutions?
I am using the 32-bit version of R 2.14.0 on Windows XP, and the latest version of the XML package (3.6.1).
Many thanks
Andrew (apologies for the large footer my work appends to all my emails...)
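[Editor's note: the thread as archived records no answer, but the workaround usually suggested for this symptom is worth sketching. The likely explanation, stated as an assumption, is that xmlParse() holds the parsed document in C-level libxml2 memory that R's garbage collector does not track, so rm() only removes the R-side reference. The XML package exports a free() function that releases that C-level memory explicitly:]

        # Sketch, assuming the C-level document is what gc() cannot reclaim.
        library(XML)

        file.name <- "C:\\MyData.xml.gz"
        TEMPP <- xmlParse(file.name)
        xx <- xmlRoot(TEMPP)

        # ... copy the relevant data out of xx here ...

        rm(xx)
        free(TEMPP)  # release the C-level libxml2 document explicitly
        rm(TEMPP)
        gc()

[Calling free() before dropping the last R reference is the key step; gc() alone cannot see memory allocated outside R's heap.]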

1 day later
Today I tried the code on a MacBook and experienced the same problem, which makes me think there is something wrong with the way I am trying to free up the memory...?
Andrew


--
View this message in context: http://r.789695.n4.nabble.com/Memory-getting-eaten-up-with-XML-tp4163468p4168098.html
Sent from the R help mailing list archive at Nabble.com.