I'm using SQL Server Management Studio and have the following schema in place:
CREATE TABLE tmp(
    id int NOT NULL IDENTITY(1,1) PRIMARY KEY,
    toleranceRegion DECIMAL
);
Then I perform the following insertions:
INSERT INTO tmp VALUES(3.2);
INSERT INTO tmp VALUES(5.678);
INSERT INTO tmp VALUES(1.95);
Expected output:
id toleranceRegion
-- ---------------
1  3.2
2  5.678
3  1.95
Actual output:
id toleranceRegion
-- ---------------
1  3
2  6
3  2
Why are the inserted toleranceRegion values being rounded to the nearest integer?
You did not define a precision or scale for your DECIMAL column. If you want 3 digits after the decimal point, define it as DECIMAL(9,3), which gives you up to 6 digits before the decimal point and up to 3 after it. Analyze the data you expect to store and pick a precision and scale that fit it in your column definition.
CREATE TABLE tmp(
    id int NOT NULL IDENTITY(1,1) PRIMARY KEY,
    toleranceRegion DECIMAL(9,3)
);
See the SQL Server documentation for decimal and numeric (Transact-SQL).
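As a quick sanity check, here is a minimal sketch repeating the question's inserts against the corrected definition (the table name tmp2 is only used here to avoid colliding with the existing tmp):

CREATE TABLE tmp2(
    id int NOT NULL IDENTITY(1,1) PRIMARY KEY,
    toleranceRegion DECIMAL(9,3)
);
INSERT INTO tmp2 VALUES(3.2);
INSERT INTO tmp2 VALUES(5.678);
INSERT INTO tmp2 VALUES(1.95);
SELECT * FROM tmp2;
-- id toleranceRegion
-- -- ---------------
-- 1  3.200
-- 2  5.678
-- 3  1.950

Note that DECIMAL(9,3) always stores exactly 3 fractional digits, so 3.2 comes back as 3.200.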
This is because you are not setting the scale, so the system uses the default scale of zero:
s (scale)
The number of decimal digits that will be stored to the right of the decimal point. This number is subtracted from p to determine the maximum number of digits to the left of the decimal point. Scale must be a value from 0 through p. Scale can be specified only if precision is specified. The default scale is 0. (emphasis added)
In other words, SQL Server stores zero digits to the right of the decimal point.
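You can see this without creating a table: in SQL Server a bare DECIMAL defaults to DECIMAL(18, 0), so casting to it rounds away the fraction. A minimal sketch (the column aliases are just illustrative):

SELECT CAST(5.678 AS DECIMAL)      AS default_scale,   -- DECIMAL means DECIMAL(18,0): returns 6
       CAST(5.678 AS DECIMAL(9,3)) AS explicit_scale;  -- returns 5.678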