
SQL Server 2008 Error - XML parsing: document parsing required too much memory

We've hit a wall in SQL Server 2008: there appears to be a maximum number of attributes that a single XML node can have before the XML parser fails.

The error we're receiving is:

Msg 6303, Level 16, State 1, Line 1
XML parsing: Document parsing required too much memory

Which is a little misleading. Our problem happens when converting a string into the XML data type (or an XML table column):

SELECT CONVERT(XML, '<DataNode><Data attr1="a" attr2="b" XXXXX /></DataNode>')

Where XXXXX is actually another 8,191 attributes.

In total our dataset contained 10,066 attributes to begin with. When we reduce the number of attributes down to 8,192, it works fine. 8,193 attributes crashes however.

It doesn't seem to have anything specifically to do with the size of the data (100MB or 60KB, it doesn't matter; we get the same failure/success based on attribute count).
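If it helps anyone reproduce this, here is a rough T-SQL sketch (not our actual test code) that builds a single node with an arbitrary number of attributes and then attempts the cast; the attr1..attrN names and the "a" values are just placeholders:

DECLARE @AttrCount INT = 8193;  -- 8,192 works for us, 8,193 fails
DECLARE @Attrs NVARCHAR(MAX) = N'';

-- Build ' attr1="a" attr2="a" ...' with @AttrCount uniquely named attributes
WITH n AS (
    SELECT TOP (@AttrCount)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS i
    FROM sys.all_objects a CROSS JOIN sys.all_objects b
)
SELECT @Attrs = @Attrs + N' attr' + CAST(i AS NVARCHAR(10)) + N'="a"'
FROM n;

DECLARE @Doc NVARCHAR(MAX) = N'<DataNode><Data' + @Attrs + N' /></DataNode>';

-- Raises Msg 6303 once the attribute count crosses the threshold
SELECT CONVERT(XML, @Doc);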

So, is there anything that can be done with SQL Server to change this limitation?

Our C# application does not have this limitation, so XML documents that are perfectly valid in C# cannot be stored in SQL Server's XML columns.

Any help anyone can offer would be greatly appreciated. The data structure cannot be changed at this point, as it would require a rewrite of the data-processing functionality of an entire application framework with hundreds of components.

PS: I have already advised the management of how ridiculous the situation is when an application stores data as attributes of a single node rather than as a tree, but this is what I have to work with :-(

Edit: we've tried this on SQL Server 2008, 32-bit and 64-bit, on a server with 2GB RAM and a server with 32GB RAM; all versions and environments have the same issue.

UPDATE:

I've tried this in SQL Server 2012, and it fails when there are 8,193 attributes AND the string is also over a given length (the test string is 833K), but it works when a string of the same length has only 8,192 attributes.

However, a much shorter string (193K) with 12,000 attributes works in both SQL Server 2012 and SQL Server 2008.

So it appears to be a combination of the number of attributes and the size of the string being cast. This gets more interesting!

Thanks for the feedback so far!

Update 2:

After further testing with a smaller string (270K) I still hit an attribute limit, this time at 16,384: 16,385 attributes fails! So it's definitely happening in increments of 8K attributes, depending on the string length.
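In case it's useful, this is roughly how I've been probing the thresholds; it reuses the same string builder as the sketch above and just wraps the cast in TRY/CATCH (the counts to probe and the shortened a1..aN="x" attribute names are only illustrative):

DECLARE @Counts TABLE (AttrCount INT);
INSERT INTO @Counts VALUES (8192), (8193), (16384), (16385);

DECLARE @AttrCount INT, @Attrs NVARCHAR(MAX), @Doc NVARCHAR(MAX), @x XML;

WHILE EXISTS (SELECT 1 FROM @Counts)
BEGIN
    -- Take the next attribute count to test
    SELECT TOP (1) @AttrCount = AttrCount FROM @Counts ORDER BY AttrCount;
    DELETE FROM @Counts WHERE AttrCount = @AttrCount;

    -- Build a single node with @AttrCount uniquely named attributes
    SET @Attrs = N'';
    WITH n AS (
        SELECT TOP (@AttrCount) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS i
        FROM sys.all_objects a CROSS JOIN sys.all_objects b
    )
    SELECT @Attrs = @Attrs + N' a' + CAST(i AS NVARCHAR(10)) + N'="x"' FROM n;

    SET @Doc = N'<DataNode><Data' + @Attrs + N' /></DataNode>';

    BEGIN TRY
        SET @x = CONVERT(XML, @Doc);
        PRINT CAST(@AttrCount AS VARCHAR(10)) + ' attributes, '
              + CAST(LEN(@Doc) / 1024 AS VARCHAR(10)) + 'K: OK';
    END TRY
    BEGIN CATCH
        PRINT CAST(@AttrCount AS VARCHAR(10)) + ' attributes: ' + ERROR_MESSAGE();
    END CATCH;
END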

asked Aug 15 '13 by doublehelix

1 Answer

8,191 attributes sounds like too many. It looks like you are trying to store actual data in the attributes, and apparently SQL Server's parser has a limitation here.

See this article: http://blogs.msdn.com/b/sqlprogrammability/archive/2006/05/23/605299.aspx?Redirected=true

When a type is needed for validation, the validator loads its definition from metadata and compiles it into a format suitable for quick validation. In order to prevent any one type from using too much memory, SQL Server caps the size of a compiled type at one megabyte. SQL Server compiles all types and performs this check when the schema is imported in order to avoid accepting types which exceed the limit.

I would suggest changing the way you use XML and storing your information inside elements.

See http://www.ibm.com/developerworks/library/x-eleatt/index.html and http://www.w3schools.com/xml/xml_attributes.asp
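As a rough illustration only (not taken from the asker's real data), the element-centric version of the sample from the question would look something like this, with each value as a child element of <Data> instead of an attribute on it:

-- Same data as <Data attr1="a" attr2="b" />, reshaped into child elements
SELECT CONVERT(XML, N'
<DataNode>
  <Data>
    <attr1>a</attr1>
    <attr2>b</attr2>
  </Data>
</DataNode>');

This avoids piling thousands of attributes onto a single node, at the cost of reworking whatever reads and writes these documents.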

answered Oct 12 '22 by AlexTheDeveloper