
C# unit testing number precision questions

I am testing basic math functions that return the mean/variance/standard deviation of a data set. The problem I am facing is that I cannot get the precision of the expected value to match what is returned by the function. For example, if the variance function returns 50.5555555555566, even if I set the expected value explicitly to 50.5555555555566, the test reports them as two different doubles and fails.

Below is the actual output from the unit test:

Assert.AreEqual failed. Expected:<50.5555555555556>. Actual:<50.5555555555566>.

Can anyone advise on a way around this? I am using the built-in Visual Studio unit testing suite. Thanks.

asked Jan 21 '23 by thomas1234


1 Answer

Floating point (Single/Double) numbers need to be compared with a tolerance value: if the two numbers are within a tolerance (say 0.0001) of each other, consider them equal.

In NUnit, you have comparison asserts for this, e.g. the following overload of AreEqual; find the equivalent one for MSTest:

Assert.AreEqual( double expected, double actual, double tolerance,
                 string message );
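
For example, a minimal NUnit sketch using that overload (Statistics.Variance is a hypothetical stand-in for your own function, and I'm assuming it returns the population variance, which for this data set is exactly 4.0):

    [Test]
    public void Variance_IsWithinTolerance()
    {
        double[] data = { 2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0 };

        double actual = Statistics.Variance(data); // hypothetical method under test

        // Third argument is the tolerance: the assert passes when
        // |expected - actual| <= 1e-10.
        Assert.AreEqual(4.0, actual, 1e-10, "variance out of tolerance");
    }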

Update: MSTest has an equivalent overload, Assert.AreEqual(double expected, double actual, double delta), which should be the method you need. Try it and see if it resolves your issue.
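
In case it helps, here is a minimal MSTest sketch along the same lines (again, Statistics.Variance is a hypothetical stand-in for the function under test):

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class StatisticsTests
    {
        [TestMethod]
        public void Variance_IsWithinTolerance()
        {
            double actual = Statistics.Variance(new[] { 2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0 });

            // MSTest treats the third argument as the allowed delta
            // between expected and actual.
            Assert.AreEqual(4.0, actual, 1e-10);
        }
    }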

answered Jan 23 '23 by Gishu