
Workaround to R memory leak with XML package

I am trying to run a simple program that extracts tables from HTML code. However, there seems to be a memory issue with readHTMLTable in the XML package. Is there any way I could easily work around this, for example by somehow giving this call its own chunk of memory and then freeing it manually?

I have tried wrapping the call in a function, calling gc(), and using different versions of R and of this package, but nothing seems to work. I am starting to get desperate.

Example code: how can I run this without the memory usage exploding?

library(XML)
a = readLines("http://en.wikipedia.org/wiki/2014_FIFA_World_Cup")
while(TRUE) {
    b = readHTMLTable(a)    # memory usage grows on every iteration
    #do something with b
}

Edit: something like the following still consumes all of my memory:

library(XML)
a = readLines("http://en.wikipedia.org/wiki/2014_FIFA_World_Cup")
f <- function(x) {
    b = readHTMLTable(x)    # memory still grows here despite the cleanup below
    rm(x)                   # drops only the function's local reference
    gc()
    return(b)
}

for(i in 1:100) {
    d = f(a)
    rm(d)
    gc()
}
rm(list=ls())
gc()

I am using Windows 7 and have tried both 32-bit and 64-bit R.

asked by Pekka

1 Answer

As of XML 3.98-1.4 and R 3.1 on Win 7, this problem can be solved by calling free() on the parsed document. free() does not work with readHTMLTable() itself, but the following code runs without exhausting memory.

library(XML)
a = readLines("http://en.wikipedia.org/wiki/2014_FIFA_World_Cup")
while(TRUE) {
    b = xmlParse(paste(a, collapse = ""))   # parse the page into an XML document
    #do something with b
    free(b)                                 # release the C-level memory held by the document
}
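
If you need the tables themselves rather than the raw document, a variant worth trying is to parse the page yourself, pass the already-parsed document to readHTMLTable(), and free() the document afterwards. This is only a minimal sketch on my part, not something verified above, and whether it avoids the residual leak inside readHTMLTable() depends on the XML version:

library(XML)
a = readLines("http://en.wikipedia.org/wiki/2014_FIFA_World_Cup")
for(i in 1:100) {
    doc = htmlParse(paste(a, collapse = ""), asText = TRUE)   # parse the HTML once per iteration
    tables = readHTMLTable(doc)                               # extract tables from the parsed document
    #do something with tables
    free(doc)                                                 # free the parsed document explicitly
    rm(doc, tables)
    gc()
}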

The xml2 package has similar issues, and there the memory can be released by using xml_remove() followed by gc().
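
A minimal sketch of the same pattern with xml2, assuming the same page; here the release relies on dropping every reference to the document and its nodes before calling gc():

library(xml2)
for(i in 1:100) {
    doc = read_html("http://en.wikipedia.org/wiki/2014_FIFA_World_Cup")   # document is an external pointer to libxml2 memory
    tbls = xml_find_all(doc, "//table")                                   # extract the table nodes
    #do something with tbls
    rm(tbls, doc)   # drop every reference to the document and its nodes
    gc()            # let the finalizer release the underlying memory
}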

answered by Peter