I have to call an MS SQL Server stored procedure with a table variable parameter.
/* Declare a variable that references the type. */
DECLARE @TableVariable AS [AList];
/* Add data to the table variable. */
INSERT INTO @TableVariable (val) VALUES ('value-1');
INSERT INTO @TableVariable (val) VALUES ('value-2');
EXEC [dbo].[sp_MyProc]
@param = @TableVariable
This works well in SQL Server Management Studio. I tried the following in Python using pyodbc:
cursor.execute("declare @TableVariable AS [AList]")
for a in mylist:
cursor.execute("INSERT INTO @TableVariable (val) VALUES (?)", a)
cursor.execute("{call dbo.sp_MyProc(@TableVariable)}")
This fails with the following error: 42000: the table variable must be declared. The variable does not survive across the separate execute calls. I also tried:
sql = "DECLARE @TableVariable AS [AList]; "
for a in mylist:
sql = sql + "INSERT INTO @TableVariable (val) VALUES ('{}'); ".format(a)
sql = sql + "EXEC [dbo].[sp_MyProc] @param = @TableVariable"
cursor.execute(sql)
This fails with the following error: No results. Previous SQL was not a query. I had no more luck with
sql = sql + "{call dbo.sp_MyProc(@TableVariable)}"
Does somebody know how to handle this using pyodbc?
The root of your problem is that a SQL Server variable has the scope of the batch it was defined in. Each call to cursor.execute is a separate batch, even if the calls are in the same transaction.
There are a couple of ways you can work around this. The most direct is to rewrite your Python code so that it sends everything as a single batch, as sketched below. (I tested this on my test server and it should work as long as you either add SET NOCOUNT ON, so the per-INSERT row counts don't get in the way of the result set, or else step over the intermediate results with nextset.)
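A minimal sketch of that single-batch approach, reusing the [AList] type and dbo.sp_MyProc from the question; the connection string, the sample list, and whether the procedure actually returns rows are assumptions:

import pyodbc

# Assumed connection details; adjust DRIVER/SERVER/DATABASE for your environment.
conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                      "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes")
cursor = conn.cursor()

mylist = ['value-1', 'value-2']

# Build one batch: SET NOCOUNT ON suppresses the per-INSERT row-count messages
# that otherwise trigger "No results. Previous SQL was not a query."
sql = "SET NOCOUNT ON; DECLARE @TableVariable AS [AList]; "
params = []
for a in mylist:
    sql += "INSERT INTO @TableVariable (val) VALUES (?); "
    params.append(a)
sql += "EXEC [dbo].[sp_MyProc] @param = @TableVariable;"

cursor.execute(sql, params)
rows = cursor.fetchall()  # only if sp_MyProc returns a result set
conn.commit()

If you leave SET NOCOUNT ON out, you can instead call cursor.nextset() repeatedly until you reach the result set produced by the procedure.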
A more indirect way is to rewrite the procedure to read from a temp table instead of taking a table variable parameter, and then create and populate the temp table from Python. A temp table that is not created inside a stored procedure has the scope of the session it was created in, so it survives across execute calls on the same connection.
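A sketch of that temp-table variant, assuming the procedure has been rewritten to read from a session temp table named #TableVariable (the table name and the val NVARCHAR(100) column type are assumptions), and reusing the conn, cursor and mylist from the sketch above:

# The CREATE TABLE is sent on its own, so the temp table is created in the
# session scope and remains visible to the following execute calls.
cursor.execute("CREATE TABLE #TableVariable (val NVARCHAR(100))")
for a in mylist:
    cursor.execute("INSERT INTO #TableVariable (val) VALUES (?)", a)
cursor.execute("SET NOCOUNT ON; EXEC [dbo].[sp_MyProc]")
rows = cursor.fetchall()  # only if sp_MyProc returns a result set
conn.commit()
cursor.execute("DROP TABLE #TableVariable")  # optional; dropped anyway when the session ends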