Is there any difference between DECIMAL and NUMERIC in SQL Server?



They are the same. Numeric is functionally equivalent to decimal.

MSDN: decimal and numeric
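On SQL Server you can see the equivalence directly. A minimal sketch (the variable names are just for illustration): declare one of each, then compare their metadata and values.

    DECLARE @d DECIMAL(10, 4) = 1234.5678;
    DECLARE @n NUMERIC(10, 4) = 1234.5678;

    -- The type names differ in metadata, but precision, scale,
    -- storage, and arithmetic are identical.
    SELECT
        SQL_VARIANT_PROPERTY(@d, 'Precision') AS d_precision,  -- 10
        SQL_VARIANT_PROPERTY(@n, 'Precision') AS n_precision,  -- 10
        SQL_VARIANT_PROPERTY(@d, 'Scale')     AS d_scale,      -- 4
        SQL_VARIANT_PROPERTY(@n, 'Scale')     AS n_scale,      -- 4
        CASE WHEN @d = @n THEN 'equal' ELSE 'different' END AS comparison;  -- equal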


This is what the SQL:2003 standard (§6.1 Data Types) says about the two:

 <exact numeric type> ::=
    NUMERIC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
  | DECIMAL [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
  | DEC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
  | SMALLINT
  | INTEGER
  | INT
  | BIGINT

 ...

21) NUMERIC specifies the data type
    exact numeric, with the decimal
    precision and scale specified by the
    <precision> and <scale>.

22) DECIMAL specifies the data type
    exact numeric, with the decimal scale
    specified by the <scale> and the
    implementation-defined decimal
    precision equal to or greater than the
    value of the specified <precision>.
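Read literally, the only distinction the standard draws is that NUMERIC's precision is exactly what you declare, while DECIMAL's precision is merely a minimum. A hypothetical illustration of what that wording permits (the table and column names are made up; SQL Server itself uses exactly the declared precision for both):

    CREATE TABLE t_example (
        a NUMERIC(5, 2),  -- precision must be exactly 5
        b DECIMAL(5, 2)   -- precision must be at least 5; an
                          -- implementation may silently provide more
    );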

To my knowledge there is no difference between the NUMERIC and DECIMAL data types. They are synonymous with each other, and either one can be used. DECIMAL and NUMERIC are numeric data types with fixed precision and scale.

Edit:

Speaking to a few colleagues, maybe it has something to do with DECIMAL being the ANSI SQL standard and NUMERIC being the one Microsoft prefers, as it's more commonly found in programming languages. ...Maybe ;)


Joakim Backman's answer is specific, but this may bring additional clarity to it.

There is a minor difference. As per SQL For Dummies, 8th Edition (2013):

The DECIMAL data type is similar to NUMERIC. ... The difference is that your implementation may specify a precision greater than what you specify — if so, the implementation uses the greater precision. If you do not specify precision or scale, the implementation uses default values, as it does with the NUMERIC type.

It seems that the difference in some implementations of SQL is one of data integrity: DECIMAL may be given a precision larger than what is declared, based on system defaults, whereas NUMERIC may not.
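On SQL Server specifically, you can confirm that neither type is widened by checking the catalog. A minimal sketch, reusing the hypothetical t_example table from above:

    -- Both columns report exactly the declared precision and scale.
    SELECT COLUMN_NAME, DATA_TYPE, NUMERIC_PRECISION, NUMERIC_SCALE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 't_example';

    -- COLUMN_NAME | DATA_TYPE | NUMERIC_PRECISION | NUMERIC_SCALE
    -- a           | numeric   | 5                 | 2
    -- b           | decimal   | 5                 | 2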