I have a large database script that migrates multiple databases of the same structure into one destination database. The destination schema is more generic, so it can store the data from the different source databases. I use a variable to keep track of the Entity currently being migrated, so I know which ID to insert into the destination table.
At the moment the migration performance is really bad. To profile the script more easily I'd like to split it up by placing GO after each table migration, but GO destroys the variable. Is there a way to declare a variable that is accessible for the whole connection/session, just like a temp #table is?
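As the question itself hints, one workaround is exactly that: a local temp table lives for the session, so it survives GO and can carry the ID between batches. A minimal sketch, with illustrative table and column names:

```sql
-- A #temp table outlives batch boundaries, unlike a DECLAREd variable.
CREATE TABLE #MigrationState (EntityID INT NOT NULL);
INSERT INTO #MigrationState (EntityID) VALUES (42);
GO

-- Next batch: any variable from the previous batch is gone,
-- but the temp table is still there.
DECLARE @EntityID INT;
SELECT @EntityID = EntityID FROM #MigrationState;
-- ... use @EntityID in this batch's table migration ...
GO

DROP TABLE #MigrationState;
```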
Query:
DECLARE
@UserID TINYINT = 1
, @LocaleID INT = 123456789
, @ApplicationID BIGINT = 123456789123456789
, @UserName VARCHAR(10) = 'User1'
, @context VARBINARY(128)
SELECT @context =
CAST(@UserID AS BINARY(1)) +
CAST(@LocaleID AS BINARY(4)) +
CAST(@ApplicationID AS BINARY(8)) +
CAST(@UserName AS BINARY(10))
SET CONTEXT_INFO @context
GO
SELECT
UserID = CAST(SUBSTRING(ci, 1, 1) AS TINYINT)
, LocaleID = CAST(SUBSTRING(ci, 2, 4) AS INT)
, ApplicationID = CAST(SUBSTRING(ci, 6, 8) AS BIGINT)
, UserName = CAST(SUBSTRING(ci, 14, 10) AS VARCHAR(10))
FROM (SELECT ci = CONTEXT_INFO()) t
Result:
UserID LocaleID ApplicationID UserName
----------- ----------- ------------------ ------------------------------
1 123456789 123456789123456789 User1
Additional info:
MSDN - CONTEXT_INFO
With SQL Server 2016 you can use sp_set_session_context:
EXEC [sys].[sp_set_session_context] @key = 'SecurityObjectUserID'
,@value = @SecurityObjectUserID
,@read_only = 1;
to set a variable, and the following to read it:
SELECT @SecurityObjectUserID = CONVERT(BIGINT,SESSION_CONTEXT(N'SecurityObjectUserID'));
Note that a key can be marked read_only; other routines in the session are then unable to change its value.
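Putting it together, session context survives GO just like CONTEXT_INFO does, so a script can set a value once and read it back in every later batch. A sketch (the key name and the value 42 are illustrative; note that with @read_only = 1 the key cannot be reset for the rest of the session):

```sql
EXEC sys.sp_set_session_context @key = N'SecurityObjectUserID'
    , @value = 42
    , @read_only = 1;
GO

-- A later batch in the same session can still read the value.
DECLARE @SecurityObjectUserID BIGINT =
    CONVERT(BIGINT, SESSION_CONTEXT(N'SecurityObjectUserID'));
SELECT @SecurityObjectUserID AS SecurityObjectUserID;
GO
```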