
write.table in a certain loop never releases memory

Consider these two pieces of code. In the first one, things work normally, and the memory usage of R is stable:

for (i in 1:100) {
  x <- rnorm(1000000)
  write.table(x, file = "test", col.names = FALSE, append = TRUE)
}

Now consider this related code, where I scrape information about an economic indicator from the World Bank. Here, R's memory usage grows with every iteration of the loop:

library(RCurl)
library(XML)

for (i in 1:26) {
  x <- getURL(paste("http://api.worldbank.org/countries/all/indicators/AG.AGR.TRAC.NO?per_page=500&date=1960:2012&page=",
                    as.character(i), sep = ""))
  x <- xmlToDataFrame(x)
  write.table(x, file = "test", col.names = FALSE, append = TRUE)
}

What is the difference between these two snippets from the point of view of writing the data, and how can I make the second one release its memory properly?
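
The only workaround I can think of, though I have not tested it, is to parse the document explicitly and free it on every iteration. I am assuming here that the extra memory is held by the C-level document that xmlToDataFrame builds internally, rather than by write.table itself, so this is only a sketch:

library(RCurl)
library(XML)

for (i in 1:26) {
  x <- getURL(paste("http://api.worldbank.org/countries/all/indicators/AG.AGR.TRAC.NO?per_page=500&date=1960:2012&page=",
                    as.character(i), sep = ""))
  doc <- xmlParse(x, asText = TRUE)   # keep a handle on the parsed document
  df  <- xmlToDataFrame(doc)
  write.table(df, file = "test", col.names = FALSE, append = TRUE)
  free(doc)                           # explicitly release the document held by the XML package
  rm(doc, df)
  gc()
}

Would calling free() and gc() like this be the right approach, or is the problem somewhere else entirely?

Thanks!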