Table Valued Constructor Maximum rows limit in Select

I have a Table Valued Constructor through which I am selecting around 1 million records. It will be used to update another table.

SELECT *
FROM   (VALUES (100,200,300),
               (100,200,300),
               (100,200,300),
               (100,200,300),
               .....
               ..... --1 million records
               (100,200,300)) tc (proj_d, period_sid, val) 

Here is my original query: https://www.dropbox.com/s/ezomt80hsh36gws/TVC.txt?dl=0#

When I run the above SELECT it simply shows "Query completed with errors", without showing any error message.


Update: I tried to catch the error number or message using a TRY/CATCH block, but it was no use; I still get the same error as above.

BEGIN TRY
    SELECT *
    FROM   (VALUES (100,200,300),
                    (100,200,300),
                    (100,200,300),
                    (100,200,300),
                    .....
                    ..... --1 million records
                    (100,200,300)) tc (proj_d, period_sid, val) 
END TRY

BEGIN CATCH
    SELECT ERROR_NUMBER(),
           ERROR_MESSAGE()
END CATCH

Why is it not executing? Is there any row limit for a Table Valued Constructor in a SELECT? I know the limit for INSERT is 1,000 rows, but I am selecting here.

asked Nov 07 '16 by Pரதீப்

1 Answer

There's no relevant hard-coded limit (65,536 × the network packet size of 4 KB is 268 MB, and your script length is nowhere near that), though it is inadvisable to use this method for a large number of rows.
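
As a quick sanity check on that batch-size ceiling (a throwaway calculation, not part of the original answer):

SELECT 65536 * 4096 AS max_batch_bytes,      -- 268,435,456 bytes
       65536 * 4096 / 1000000.0 AS approx_mb -- ~268 MB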

The error you are seeing is thrown by the client tools, not SQL Server. If you construct the SQL string with dynamic SQL, compilation is at least able to start successfully:

DECLARE @SQL NVARCHAR(MAX) = '(100,200,300),
';
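-- NVARCHAR(MAX) matters above: with a non-MAX type, REPLICATE would truncate its result at 8,000 bytes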

SELECT @SQL = 'SELECT * FROM (VALUES ' + REPLICATE(@SQL, 1000000) + '
(100,200,300)) tc (proj_d, period_sid, val)';

SELECT @SQL AS [processing-instruction(x)]
FOR XML PATH('')
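-- the processing-instruction / FOR XML trick above is only there to let you inspect the full ~30 MB string in SSMS without the grid truncating or entitizing it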

SELECT DATALENGTH(@SQL) / 1048576.0 AS [Length in MB] --30.517705917

EXEC(@SQL);

Though I killed the above after ~30 minutes of compilation time and it still hadn't produced a row. The literal values need to be stored inside the plan itself as a table of constants, and SQL Server spends a lot of time trying to derive properties about them too.

SSMS is a 32-bit application and throws a std::bad_alloc exception whilst parsing the batch.


It tries to push an element onto a vector of Token that has reached capacity, and the attempt to resize fails because a large enough contiguous area of memory is unavailable. So the statement never even makes it as far as the server.

The vector capacity grows by 50% each time (i.e. following the sequence here). The capacity that the vector needs to grow to depends on how the code is laid out.

The following needs to grow from a capacity of 19 to 28.

SELECT * FROM 
(VALUES 
(100,200,300),
(100,200,300),
(100,200,300),
(100,200,300),
(100,200,300),
(100,200,300)) tc (proj_d, period_sid, val)

and the following only needs a size of 2

SELECT * FROM (VALUES (100,200,300),(100,200,300),(100,200,300),(100,200,300),(100,200,300),(100,200,300)) tc (proj_d, period_sid, val)

The following needs a capacity of > 63 and <= 94.

SELECT *
FROM   (VALUES 
      (100,
       200,
       300),
      (100,
       200,
       300),
      (100,
       200,
       300),
      (100,
       200,
       300),
      (100,
       200,
       300),
      (100,
       200,
       300)
       ) tc (proj_d, period_sid, val) 

For a million rows laid out as in case 1 the vector capacity needs to grow to 3,543,306.
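
If it helps to see where that figure comes from, here is a throwaway T-SQL loop, assuming the vector grows as new capacity = old capacity + old capacity / 2 (with integer truncation), which matches the numbers quoted above:

DECLARE @capacity BIGINT = 19; -- the starting capacity quoted for case 1's six-row example

WHILE @capacity < 3543306
BEGIN
    SET @capacity = @capacity + @capacity / 2; -- 19, 28, 42, 63, 94, 141, ...
    PRINT @capacity;                           -- the last value printed is exactly 3,543,306
END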

You might find that either of the following will allow the client-side parsing to succeed.

  1. Reduce the number of line breaks (as in case 2 above).
  2. Restart SSMS in the hope that the request for large contiguous memory succeeds when there is less address space fragmentation.

However, even if you do successfully send it to the server, it will only end up killing the server during execution plan generation anyway, as discussed above.

You'll be much better off using the import/export wizard to load the table. If you must do it in TSQL, you'll find that breaking it into smaller batches and/or using another method such as shredding XML performs better than Table Valued Constructors. The following executes in 13 seconds on my machine, for example (though if using SSMS you'd still likely have to break it up into multiple batches rather than pasting in a massive XML string literal).

DECLARE @S NVARCHAR(MAX) = '<x proj_d="100" period_sid="200" val="300" />
' ; 

DECLARE @Xml XML = REPLICATE(@S,1000000);

SELECT 
    x.value('@proj_d','int'),
    x.value('@period_sid','int'),
    x.value('@val','int')
FROM @Xml.nodes('/x') c(x)
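
If you go the smaller-batches route instead, the idea is simply to stay under the 1,000 row constructor cap on INSERT ... VALUES; a minimal sketch (the #target table name and its columns are made up here for illustration):

CREATE TABLE #target (proj_d INT, period_sid INT, val INT);

-- each INSERT carries at most 1,000 row constructors; repeat (or generate) as many batches as needed
INSERT INTO #target (proj_d, period_sid, val)
VALUES (100,200,300),
       (100,200,300),
       (100,200,300); -- ... up to 1,000 rows per statement

INSERT INTO #target (proj_d, period_sid, val)
VALUES (100,200,300),
       (100,200,300); -- ... next batch, and so on
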
answered Sep 27 '22 by Martin Smith