I'm trying to do a dump to XML of a very large database (many gigabytes). I'm using Linq-to-SQL to get the data out of the database and Linq-to-XML to generate XML. I'm using XStreamingElement to keep memory use low. The job still allocates all available memory, however, before keeling over without having written any XML. The structure looks like this:
var foo =
    new XStreamingElement("contracts",
        <LinqtoSQL which fetches data>.Select(d =>
            new XElement("contract",
                ... generate attributes etc...
            )));
using (StreamWriter sw = new StreamWriter("contracts.xml"))
{
    using (XmlWriter xw = XmlWriter.Create(sw))
    {
        foo.WriteTo(xw);
    }
}
I've also tried saving with:
foo.Save("contracts.xml", SaveOptions.DisableFormatting);
...to no avail.
Any clues?
How complex is the data? I'm not overly familiar with XStreamingElement, but I wonder if you might have more joy using XmlWriter directly? Especially for writing similar records in a loop, it can be used pretty easily.

I would, however, have concerns over xml as the choice for this data. Is this a requirement? Or simply a convenient available format? In particular, it can be hard to parse xml of that size conveniently, as you'd have to use XmlReader (which is harder to get right than XmlWriter).
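To illustrate what I mean, here's a minimal sketch of the XmlWriter-in-a-loop approach; `GetContracts()` and the `Id` property are hypothetical stand-ins for your actual query and columns:

```csharp
// Stream contracts straight to disk, one row at a time.
// GetContracts() is a placeholder for your LINQ-to-SQL query.
using (XmlWriter xw = XmlWriter.Create("contracts.xml"))
{
    xw.WriteStartElement("contracts");
    foreach (var contract in GetContracts())
    {
        xw.WriteStartElement("contract");
        xw.WriteAttributeString("id", contract.Id.ToString());
        // ... write the remaining attributes/elements here ...
        xw.WriteEndElement(); // </contract>
    }
    xw.WriteEndElement(); // </contracts>
} // Dispose flushes and closes the file
```

Because each row is written and then discarded, memory stays flat regardless of how many rows the query returns (assuming the query itself streams rather than buffers).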
If you can use other formats, I'd advise it... a few leap to mind, but I won't babble on unless you mention that you'd be interested.