I have a SQL Server table which stores a JSON string in one of its columns. The JSON string is a serialised .NET object, and the data typically exceeds 4,000 characters.
I have a simple stored procedure which I use to retrieve the data:
CREATE PROCEDURE dbo.GetStageProgram   -- procedure name assumed for illustration
    @StageID int,
    @Description varchar(250) = null OUTPUT,
    @Program nvarchar(max) = null OUTPUT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT @Program = StageProgram, @Description = Description
    FROM StageProgram
    WHERE StageID = @StageID;

    RETURN 0;
END
I am using the data type nvarchar(max) for the column. When I serialise the .NET object to JSON and write it to the database using Dapper, I find that the full string is correctly stored in the database.
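For reference, the write side looks roughly like this; the object shape, the serialiser and the INSERT statement below are illustrative assumptions rather than my exact code:

// Minimal sketch of the write path (illustrative; serialiser and INSERT assumed).
using System.Data.SqlClient;
using System.Text.Json;
using Dapper;

var stage = new { Name = "Stage 1", Steps = new[] { "Load", "Mix", "Heat" } };
string json = JsonSerializer.Serialize(stage);   // in practice this is well over 4,000 characters

using var connection = new SqlConnection("<connection string>");
connection.Execute(
    "INSERT INTO StageProgram (StageID, Description, StageProgram) VALUES (@StageID, @Description, @Program);",
    new { StageID = 1, Description = "Example stage", Program = json });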
However, when I attempt to retrieve the string I find that it is trimmed to 4000 characters, discarding the rest of the data.
Here is the relevant code:
DynamicParameters p = new DynamicParameters();
p.Add("@StageID", Properties.Settings.Default.StageID, DbType.Int32, ParameterDirection.Input);
p.Add("@Description", "", DbType.String, direction: ParameterDirection.Output);
p.Add("@Program", "", DbType.String, direction: ParameterDirection.Output);
p.Add("@ReturnValue", dbType: DbType.Int32, direction: ParameterDirection.ReturnValue);

try
{
    int stageID = Properties.Settings.Default.StageID;
    connection.Execute(sql, p, commandType: CommandType.StoredProcedure);
    json = p.Get<string>("@Program");
    int r = p.Get<int>("@ReturnValue");
}
When I run this, the string json is trimmed to 4000 characters.
If I instead retrieve the value using a plain ADO.NET SqlCommand (with a query rather than the stored procedure, for simplicity), the full data is returned correctly:
SqlCommand getProgram = new SqlCommand("SELECT StageProgram FROM StageProgram WHERE StageID = 1;");
getProgram.Connection = connection;
string json = Convert.ToString(getProgram.ExecuteScalar());
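Incidentally, reading the value back through a Dapper query rather than an output parameter also returns the full string, since no parameter size applies to the result set. A minimal sketch against the same connection:

// Reading the column as a query result avoids the output-parameter size limit.
string json = connection.QuerySingleOrDefault<string>(
    "SELECT StageProgram FROM StageProgram WHERE StageID = @StageID;",
    new { StageID = 1 });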
Is an experienced Dapper user able to provide an explanation for this behaviour?
Can it be changed?
The answer is: there is no difference between nvarchar(7) and nvarchar(4000) in terms of performance and storage size, because nvarchar only ever stores the actual length of the data. Interestingly, though, changing the column to nvarchar(max) does make a difference to both storage and performance. Why is that? The documentation explains:
nvarchar [ ( n | max ) ] — n defines the string size in byte-pairs and can be a value from 1 through 4,000; max indicates that the maximum storage size is 2^31-1 bytes (2 GB). The storage size, in bytes, is two times the actual length of data entered + 2 bytes; for example, a 100-character value occupies 2 × 100 + 2 = 202 bytes whether the column is declared nvarchar(4000) or nvarchar(max). The ISO synonyms for nvarchar are national char varying and national character varying. So an nvarchar(max) column can hold roughly one billion 2-byte Unicode characters.
One practical caveat: an nvarchar(max) column cannot be used as an index key column, because index keys are limited in size (900 bytes for a clustered index key, 1,700 bytes for a nonclustered index key); it can only be added to an index as an included column. You can always store shorter strings than the declared size, but never longer ones.
4000 characters is (currently) the default length Dapper applies to string parameters when no explicit size is given (DbString.DefaultLength in the Dapper source), which is why the output parameter is truncated.
To get the full text, you just need to set the size parameter:
p.Add("@Program", "", DbType.String, direction: ParameterDirection.Output, size:int.MaxValue);