I've inherited a project where the application's data model is an XML document. The developers before me created an object model based on this XML's schema, and then coded against the object model.
After several years of maintenance, this application has gradually started to show its age. The team leader has said that the key reason behind this is the 'slowness' of XML serialization. I'm tempted to call BS on this, but many of the XML files we deal with are over 2MB in size, and keeping in mind the basics of what goes on behind the scenes with objects marked [Serializable], 2MB is a lot to reflect over, so there might be some truth to the slowness theory.
In your experience, is serialization really so 'slow'/bad as to opt for an XML -> XPath model instead of an XML -> POCO model?
BTW this is a .NET 2.0 project, and our clients might be upgrading to .NET 3.5 sometime late next year.
In general, no, I don't think the slowdown is due to the XML Serialization; 2MB isn't that large, and it shouldn't be causing any major slowdown.
What I'd be more concerned about is the team leader telling you what the slowdown is due to without any specific profiling data showing that this is actually the case. Opinions about optimization are frequently wrong; profiling exists precisely to pinpoint where an app is spending its time. I'd recommend instrumenting and profiling the app and finding where the slowdown really is; I'd bet it's not in the XML serialization.
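As a first step before any profiler, you could time the serialization path in isolation. Here's a minimal sketch, assuming a hypothetical root type named OrderDocument (substitute the project's actual generated root class) and a hypothetical sample file path; it separates the one-time cost of constructing the XmlSerializer (which generates a temporary serialization assembly) from the per-document deserialization cost:

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Xml.Serialization;

// Stand-in for the real generated root type; replace with your own.
public class OrderDocument
{
    public string Name;
}

class SerializationTimer
{
    static void Main()
    {
        // Constructing the serializer for a type is the expensive, reflection-heavy
        // step: a temporary serialization assembly is generated and compiled.
        Stopwatch sw = Stopwatch.StartNew();
        XmlSerializer serializer = new XmlSerializer(typeof(OrderDocument));
        sw.Stop();
        Console.WriteLine("Serializer construction: {0} ms", sw.ElapsedMilliseconds);

        // Time the deserialization of a representative ~2MB document.
        sw.Reset();
        sw.Start();
        object doc;
        using (FileStream stream = File.OpenRead("sample.xml")) // hypothetical path
        {
            doc = serializer.Deserialize(stream);
        }
        sw.Stop();
        Console.WriteLine("Deserialization: {0} ms", sw.ElapsedMilliseconds);
        GC.KeepAlive(doc);
    }
}
```

If the deserialization number comes back in the tens of milliseconds for a 2MB file, the slowness claim is effectively refuted and the real bottleneck is elsewhere.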