I have a large XML file that contains many sub-elements, and I want to be able to run some XPath queries against it. I tried using vtd-xml in Java, but I sometimes get an OutOfMemoryError because the XML is too large to fit into memory. Is there an alternative way of processing such large XML files?
Try http://code.google.com/p/jlibs/wiki/XMLDog
It evaluates XPaths using SAX, without building an in-memory representation of the XML document.
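Roughly, the usage documented on that wiki page looks like the sketch below. The package names, class names (XMLDog, XPathResults, Expression), and the print call are my recollection of the wiki and may differ between jlibs versions, and the file name, namespace, and XPath are placeholders, so treat this as illustrative rather than copy-paste ready:

```java
// Illustrative only -- check the XMLDog wiki page for the exact package
// names and method signatures of your jlibs version.
import jlibs.xml.DefaultNamespaceContext;      // assumed package
import jlibs.xml.sax.dog.XMLDog;               // assumed package
import jlibs.xml.sax.dog.XPathResults;         // assumed package
import jlibs.xml.sax.dog.expr.Expression;      // assumed package
import org.xml.sax.InputSource;

public class XmlDogExample {
    public static void main(String[] args) throws Exception {
        // Declare any prefixes your XPaths use (placeholder namespace)
        DefaultNamespaceContext nsContext = new DefaultNamespaceContext();
        nsContext.declarePrefix("ns", "http://example.com/ns");

        XMLDog dog = new XMLDog(nsContext);

        // Register the XPaths up front; XMLDog evaluates all of them
        // in a single SAX pass over the document (placeholder XPath)
        Expression xpath1 = dog.addXPath("/ns:root/ns:item/@id");

        // Stream the document once and collect the results (placeholder file)
        XPathResults results = dog.sniff(new InputSource("large.xml"));

        results.print(java.util.Arrays.asList(xpath1), System.out);
    }
}
```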
A SAX parser is very efficient when working with large files.
What are you trying to do right now? By the sound of it, you are using a DOM-based parser, which loads the entire XML file into memory as a DOM representation. If you are dealing with a large file, you'd be better off using a SAX parser, which processes the XML document in a streaming fashion. A minimal sketch follows.
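Here is a minimal sketch of that approach with the JDK's built-in SAX parser. The file name large.xml and the element name item are placeholders for your own document; the handler just counts matching elements to show how events arrive:

```java
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class SaxCount {
    public static void main(String[] args) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();

        // The handler receives events as the parser streams through the file,
        // so memory use stays flat regardless of document size.
        DefaultHandler handler = new DefaultHandler() {
            private int count;

            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attributes) {
                if ("item".equals(qName)) {   // placeholder element name
                    count++;
                }
            }

            @Override
            public void endDocument() {
                System.out.println("item elements: " + count);
            }
        };

        parser.parse(new java.io.File("large.xml"), handler);  // placeholder file
    }
}
```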
I personally recommend StAX for this.
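For comparison, a minimal StAX sketch using the standard javax.xml.stream API; you pull events one at a time instead of receiving callbacks. Again, large.xml, item, and the id attribute are placeholders:

```java
import java.io.FileInputStream;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StaxScan {
    public static void main(String[] args) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        try (FileInputStream in = new FileInputStream("large.xml")) {  // placeholder file
            XMLStreamReader reader = factory.createXMLStreamReader(in);

            // Pull events one at a time; only the current event is held in memory.
            while (reader.hasNext()) {
                int event = reader.next();
                if (event == XMLStreamConstants.START_ELEMENT
                        && "item".equals(reader.getLocalName())) {     // placeholder element
                    System.out.println(reader.getAttributeValue(null, "id"));
                }
            }
            reader.close();
        }
    }
}
```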