I was wondering: how does SqlParameter know the type when it is not specified? For example, when instantiating a parameter object you don't need to specify the DB type; somehow, when the command is executed, .NET handles this for you. Is there some sort of conversion taking place behind the scenes? Is there a cost when you do or don't specify the type explicitly?
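To illustrate the two styles I mean, here is a minimal sketch (the connection string and the Customers table are just placeholders):

```csharp
using System.Data;
using Microsoft.Data.SqlClient; // the older System.Data.SqlClient behaves the same way

// Placeholder connection string and table, only to show the two styles.
using var conn = new SqlConnection("Server=.;Database=Example;Integrated Security=true");
using var cmd = new SqlCommand("SELECT Id FROM Customers WHERE Name = @name", conn);

// Style 1: no type specified; the provider infers it from the CLR type of the value.
cmd.Parameters.AddWithValue("@name", "somestring");

// Style 2: type (and size) specified explicitly.
// cmd.Parameters.Add("@name", SqlDbType.VarChar, 50).Value = "somestring";
```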
Is there a cost when you do / don't specify the type explicitly?
Yes, and the cost can be huge. It has nothing to do with a client-side cast, and everything to do with server-side execution. You should read Under the Table - How Data Access Code Affects Database Performance. Some of the problems that can occur:
- When you use SqlCommand.Parameters.AddWithValue("@someparam", "somestring"), the resulting parameter is of type NVARCHAR. If your query has a clause WHERE somecolumn = @someparam and somecolumn is of type VARCHAR and indexed, the type mismatch will prevent the index from being used.
- SqlCommand.Parameters.AddWithValue("@someparam", "somestring") results in a query whose parameter is declared as NVARCHAR(10), while SqlCommand.Parameters.AddWithValue("@someparam", "anotherstring") results in a parameter declared as NVARCHAR(13). These are considered different queries and produce different plans. Over time you can pollute the server plan cache with hundreds of plans, one for each possible length of the parameter. This gets exacerbated when multiple parameters are present, since each combination of lengths creates its own plan (see the sketch after this list for the fix).

There are possible problems with other types too, but strings are the most notorious culprits. Again, read the linked article; it is highly relevant.
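Specifying both the type and the size explicitly avoids the two problems above. A minimal sketch, assuming a hypothetical Customers table whose Name column is VARCHAR(50):

```csharp
using System.Data;
using Microsoft.Data.SqlClient;

static int GetCustomerId(SqlConnection conn, string name)
{
    // conn is assumed to be open.
    using var cmd = new SqlCommand(
        "SELECT Id FROM Customers WHERE Name = @name", conn);

    // Declare the type AND the size explicitly: the parameter matches the
    // column (VARCHAR, not NVARCHAR), so the index stays usable, and the
    // declared length is a constant 50 regardless of the value passed.
    cmd.Parameters.Add("@name", SqlDbType.VarChar, 50).Value = name;

    return (int)cmd.ExecuteScalar();
}
```

Because the declared length no longer varies with the value, every call produces the same parameterized query text and reuses a single cached plan.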