I know, I know, this has been done to death; I'm just posting a question to ask whether the usual solution is still relevant now that we have .NET 4 and newer.
This link explains a simple way to read large XML files, and it uses LINQ. I quite like it and just want a simple answer stating whether this is still relevant, or whether there are better implementations in newer .NET code.
The answer to this question hasn't changed in .NET 4: for the best performance you should still use XmlReader, because it streams the document instead of loading the whole thing into memory.
The code you refer to uses XmlReader for the actual querying, so it should be reasonably quick on large documents.
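Since the linked article leans on LINQ, it is worth noting that XmlReader and LINQ to XML combine cleanly: the reader does the streaming, and XNode.ReadFrom materialises one element at a time as an XElement. A minimal sketch, assuming a file of repeated <item> elements (the method name, path and element name are placeholders):

using System.Collections.Generic;
using System.Xml;
using System.Xml.Linq;

static IEnumerable<XElement> StreamItems(string path)
{
    using (XmlReader reader = XmlReader.Create(path))
    {
        reader.MoveToContent();
        while (!reader.EOF)
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == "item")
            {
                // XNode.ReadFrom consumes the whole <item> and leaves the
                // reader positioned just after it, so no extra Read() is needed.
                yield return (XElement)XNode.ReadFrom(reader);
            }
            else
            {
                reader.Read();
            }
        }
    }
}

Each returned XElement can then be queried with ordinary LINQ operators, and only one item is ever held in memory at a time.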
If the file looks like this:
<root>
<item>...</item>
<item>...</item>
...
</root>
you can read the file with XmlReader and load each 'item' into an XmlDocument, like this:
reader.ReadToDescendant("root");
reader.ReadToDescendant("item");
do
{
XmlDocument doc = new XmlDocument();
doc.LoadXml(reader.ReadOuterXml());
XmlNode item = doc.DocumentElement;
// do your work with `item`
}
while (reader.ReadToNextSibling("item"));
reader.Close();
In this case there is no practical limit on file size, because only one item is in memory at a time.
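Inside the loop, `item` is an ordinary in-memory node, so the usual DOM/XPath calls apply to it. A small sketch of the "// do your work" part, with a made-up child element name purely for illustration:

XmlNode nameNode = item.SelectSingleNode("name"); // "name" is a hypothetical child of <item>
if (nameNode != null)
{
    Console.WriteLine(nameNode.InnerText);
}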
The best way to do this is to read the file node by node using XmlReader.Create.
// WhitespaceHandling only exists on XmlTextReader; with XmlReader.Create use XmlReaderSettings instead.
var settings = new XmlReaderSettings { IgnoreWhitespace = true };
using (var reader = XmlReader.Create(filename, settings))
{
    while (reader.Read())
    {
        // your code here.
    }
}
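For example, the loop body might check the node type and name before pulling values out; a sketch only, with made-up element and attribute names:

while (reader.Read())
{
    if (reader.NodeType == XmlNodeType.Element && reader.Name == "item")
    {
        // attributes must be read while positioned on the start tag
        Console.WriteLine(reader.GetAttribute("id")); // "id" is a hypothetical attribute
    }
    else if (reader.NodeType == XmlNodeType.Text)
    {
        Console.WriteLine(reader.Value);
    }
}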