 

Set decimal precision in SQL CLR function in C#

I have a SQL CLR function written in C# that goes something like:

public partial class UserDefinedFunctions
{
    [Microsoft.SqlServer.Server.SqlFunction]
    public static SqlString Decimal(decimal sum)
    {
        return sum.ToString();
    }
}

And SQL Server assumes that 'sum' is a numeric(18,0) type.

Is there a way to change the precision and scale of an input parameter in C#?

Tomas asked Mar 28 '26 19:03


2 Answers

The precision and scale can be specified using the SqlFacet attribute on the parameter, e.g. [Microsoft.SqlServer.Server.SqlFacet(Precision = ..., Scale = ...)].
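Applied to the function from the question, it could look something like this (a sketch only; precision 18 and scale 4 are illustrative values, not a recommendation):

```csharp
using Microsoft.SqlServer.Server;
using System.Data.SqlTypes;

public partial class UserDefinedFunctions
{
    [SqlFunction]
    public static SqlString Decimal(
        // Illustrative facet: SQL Server would expose this parameter as DECIMAL(18, 4)
        [SqlFacet(Precision = 18, Scale = 4)] decimal sum)
    {
        return new SqlString(sum.ToString());
    }
}
```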

allmhuran answered Apr 01 '26 08:04


allmhuran is not incorrect regarding SqlFacet, but keep in mind that SqlFacet (at least the Precision and Scale properties, and possibly some others) is only consumed by Visual Studio / SSDT when it auto-generates the T-SQL publish script for the DLL. In other words, you do not need SqlFacet at all if you write your own CREATE statements. Using SSDT is great for initial development / testing, but SqlFacet does not support all the options you might want to set, such as parameter defaults or WITH RETURNS NULL ON NULL INPUT (used for scalar UDFs). For example:

  • CREATE FUNCTION [schema_name].[function_name]
    (
      @param1 NVARCHAR(MAX),
      @param2 DECIMAL(17, 9)
    )
    RETURNS NVARCHAR(MAX)
    WITH EXECUTE AS CALLER,
         RETURNS NULL ON NULL INPUT
    AS EXTERNAL NAME [assembly].[class].[method];
    
  • CREATE PROCEDURE [schema_name].[proc_name]
        @param1 NVARCHAR(4000) = N'default value',
        @param2 DECIMAL(17, 9) = 123.456789
    WITH EXECUTE AS CALLER
    AS EXTERNAL NAME [assembly].[class].[method];
    

ALSO:

  • If you are creating the objects manually, you will still need the VARBINARY literal representation of the DLL for the CREATE ASSEMBLY statement (the assumption being that Visual Studio is used to compile the assembly, but not that it / SSDT is used to generate the publish SQL script). For this you can use a simple, open-source command line utility I wrote: BinaryFormatter
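    A minimal sketch of that manual step (the assembly name is a placeholder and the hex literal is truncated for illustration; the real value is the full byte content of the compiled DLL):

    ```sql
    -- 0x4D5A... is a truncated placeholder, not a real assembly
    CREATE ASSEMBLY [assembly]
    FROM 0x4D5A90000300...
    WITH PERMISSION_SET = SAFE;
    ```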

  • You should be using SqlDecimal as the input type instead of decimal. You then get the actual decimal value from the param using the Value property:

    public static SqlString Decimal(
        [SqlFacet(Precision = ..., Scale = ...)] SqlDecimal sum)
    {
       decimal theValue = sum.Value;
    }
    

For more info on working with SQLCLR in general, please visit: SQLCLR Info

Solomon Rutzky answered Apr 01 '26 07:04


