Is there a way around the 8k row length limit in SQL Server?

First off, I know that in general having large numbers of wide columns is a bad idea, but this is the format I'm constrained to.

I have an application that imports CSV files into a staging table before manipulating them and inserting/updating values in the database. The staging table is created on the fly and has a variable number of NVARCHAR columns into which the file is imported, plus two INT columns used as row IDs.

One particular file I have to import is about 450 columns wide. With the 24-byte pointer used for each large NVARCHAR column, this adds up to around 10 KB by my calculations, and I get the error: Cannot create a row of size 11166 which is greater than the allowable maximum row size of 8060.

Is there a way around this, or are my only options to modify the importer to split the import or to remove columns from the file?
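For reference, here's a simplified sketch of the kind of staging table the importer builds on the fly (the table and column names are invented; the real ones come from the CSV header):

    -- Hypothetical illustration of the dynamically built staging table.
    -- 450 NVARCHAR(MAX) columns, each leaving a 24-byte in-row pointer once
    -- its value is pushed off-row: 450 * 24 = 10,800 bytes, which already
    -- exceeds the 8,060-byte row-size limit when a wide row is inserted.
    DECLARE @cols NVARCHAR(MAX) = N'';
    DECLARE @i INT = 1;

    WHILE @i <= 450
    BEGIN
        SET @cols += N', Col' + CAST(@i AS NVARCHAR(10)) + N' NVARCHAR(MAX) NULL';
        SET @i += 1;
    END;

    DECLARE @sql NVARCHAR(MAX) =
        N'CREATE TABLE dbo.Staging (RowID INT, FileID INT' + @cols + N');';

    EXEC sp_executesql @sql;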

Asked May 08 '13 by Jaloopa

2 Answers

You can use TEXT/NTEXT, which uses a 16-byte pointer, whereas VARCHAR/NVARCHAR uses a 24-byte pointer.

NVARCHAR(MAX) or NTEXT can store more than 8 KB of data, but a record cannot be larger than 8 KB up to and including SQL Server 2012. If the data does not fit in the 8 KB page, the data of the larger column is moved to another page, and a 24-byte pointer (for VARCHAR/NVARCHAR) is stored as a reference in the main row; if the data type is TEXT/NTEXT, a 16-byte pointer is used.
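A minimal sketch of that suggestion, assuming the same kind of staging table as in the question (names are illustrative only): with NTEXT, every value lives off-row behind a 16-byte in-row pointer, so roughly 450 * 16 = 7,200 bytes of in-row space, which stays under the 8,060-byte limit.

    -- Same staging pattern, but with NTEXT columns (16-byte in-row pointers).
    -- Only the first few columns are shown; the real table would repeat the
    -- pattern out to ~450 columns.
    CREATE TABLE dbo.StagingNText (
        RowID  INT,
        FileID INT,
        Col1   NTEXT NULL,
        Col2   NTEXT NULL,
        Col3   NTEXT NULL
    );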

Answered by Banketeshvar Narayan


For details, see the following links:

Work around SQL Server maximum columns limit 1024 and 8kb record size

or

http://msdn.microsoft.com/en-us/library/ms186939(v=sql.90).aspx

Answered by Kaushik Sharma