I have a section of T-SQL code that looks something like this:
select @automation_rate =
    case
        when @total_count = 0 then 0
        else @automated_count / @total_count
    end
@automation_rate is decimal(3,2). @total_count and @automated_count are integers.
Unfortunately the only values ever returned for @automation_rate are either 0 or 1. Clearly there is something wrong here, and it may be ridiculously simple, but for the life of me I can't see it.
All data in the underlying resultset (I'm looping over a table) is either 0 or a positive integer. All values are integers except for the automation rate.
Here's example values and expected (using a calculator) vs actual results:
automated count   total count   expected ratio   actual ratio
---------------   -----------   --------------   ------------
              0            35             0.00           0.00
             98           258             0.38           0.00
             74           557             0.13           0.00
            140           140             1.00           1.00
As you can see I get a ratio of 0.00 for all values except where automated = total. I also have an Excel spreadsheet that does the same basic calculation and it comes out perfect (i.e. just like the "expected" column) every time.
So where did I go wrong?
(This is on MS SQL Server 2005, on the off chance that has any impact at all)
Edit Thanks to everyone for the answers. I blew the integer rounding part by assuming since it was moving into a decimal data type that it would automatically convert, instead of realizing it would do the calculation, round, and then convert. Everyone had similar answers so upvotes all around.
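For anyone hitting the same thing, the "calculate, truncate, then convert" behavior described above can be reproduced with a couple of one-liners (values taken from the table in the question):

```sql
-- Integer / integer: the division happens in integer arithmetic first,
-- so the fractional part is discarded before any conversion to decimal.
SELECT 98 / 258;        -- 0
SELECT 140 / 140;       -- 1

-- Forcing one operand to a non-integer type promotes the whole expression,
-- and the fractional part survives.
SELECT 1.0 * 98 / 258;  -- approximately 0.379845
```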
This is due to integer math: since @automated_count and @total_count are both integers, the division is performed as integer division, so the fractional part is truncated *before* the result is converted to decimal. You need to force one operand to a non-integer type, e.g.:
1.0*@automated_count / @total_count
Or more explicitly:
CONVERT(DECIMAL(5,2), @automated_count) / @total_count
Also, these will yield ratios like 0.3798, so if you want a percentage you might want something like the following (note that @automation_rate would then need to be widened, e.g. to DECIMAL(5,2), since DECIMAL(3,2) tops out at 9.99):
CONVERT(DECIMAL(5,2), 100.0*@automated_count / @total_count)
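Putting it together, here is one way to rewrite the original CASE so the division happens in decimal arithmetic (a sketch; the exact CONVERT placement and precision are just one reasonable choice):

```sql
SELECT @automation_rate =
    CASE
        WHEN @total_count = 0 THEN 0
        -- Multiplying by 1.0 promotes the division out of integer math;
        -- the CONVERT then pins down the precision of the result.
        ELSE CONVERT(DECIMAL(5,2), 1.0 * @automated_count / @total_count)
    END;
```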