I'm using the XML package to read data from about 2000 files. As more and more files are read in, R's memory usage keeps climbing. How can I fix this?
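For context, the read loop is essentially the following (a simplified sketch; the directory, file pattern, and XPath query are placeholders, not the real code):

library(XML)

files <- list.files("data", pattern = "\\.xml$", full.names = TRUE)  # placeholder directory
results <- vector("list", length(files))
for (i in seq_along(files)) {
  doc <- xmlParse(files[i])                               # returns an internal (C-level) document
  results[[i]] <- xpathSApply(doc, "//record", xmlValue)  # "//record" is a made-up XPath
}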
Googling turns up similar reports, for example http://stackoverflow.com/questions/9220849/serious-memory-leak-when-iteratively-parsing-xml-files
and http://r.789695.n4.nabble.com/memory-leak-using-XML-readHTMLTable-td4643332.html
(谢老大 even chimed in there, :-)).
But even after reading those pages, I still don't know how to fix it.
Has anyone run into this problem and solved it?
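If I understand those threads correctly, the suggestion is that the C-level libxml2 document returned by xmlParse() has to be released explicitly, because R's garbage collector does not reclaim it on its own. A minimal sketch of that pattern (again with a placeholder directory and XPath):

library(XML)

for (f in list.files("data", pattern = "\\.xml$", full.names = TRUE)) {
  doc <- xmlParse(f)
  vals <- xpathSApply(doc, "//record", xmlValue)
  ## ... use vals ...
  free(doc)  # explicitly free the libxml2 document
  rm(doc)
}
gc()         # force a collection after the loop

Is this the right approach, or is something else needed?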
> sessionInfo()
R version 2.15.1 (2012-06-22)
Platform: i386-pc-mingw32/i386 (32-bit)
locale:
[1] LC_COLLATE=Chinese_People's Republic of China.936 LC_CTYPE=Chinese_People's Republic of China.936 LC_MONETARY=Chinese_People's Republic of China.936
[4] LC_NUMERIC=C LC_TIME=Chinese_People's Republic of China.936
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] lifi_0.0.1 RCurl_1.91-1.1 bitops_1.0-4.1 XML_3.9-4.1