Why do I get different results using a virtual or non-virtual property?

Tags: c#, .net

The code below, when run in a Release configuration on .NET 4.5, produces the following output:

Without virtual: 0.333333333333333
With virtual:    0.333333343267441

(When running in Debug, both versions give 0.333333343267441.)

I can see that dividing a float by a short and returning the result as a double is likely to produce garbage after a certain point.

My question is: Can anyone explain why the results differ depending on whether the property providing the short in the denominator is virtual or non-virtual?

public class ProvideThreeVirtually
{
    public virtual short Three { get { return 3; } }
}

public class GetThreeVirtually
{
    public double OneThird(ProvideThreeVirtually provideThree)
    {
        return 1.0f / provideThree.Three;
    }
}

public class ProvideThree
{
    public short Three { get { return 3; } }
}

public class GetThree
{
    public double OneThird(ProvideThree provideThree)
    {
        return 1.0f / provideThree.Three;
    }
}

class Program
{
    static void Main()
    {
        var getThree = new GetThree();
        var result = getThree.OneThird(new ProvideThree());

        Console.WriteLine("Without virtual: {0}", result);

        var getThreeVirtually = new GetThreeVirtually();
        var resultV = getThreeVirtually.OneThird(new ProvideThreeVirtually());

        Console.WriteLine("With virtual:    {0}", resultV);
    }
}
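
(For reference, here is OneThird with the implicit conversions written out. The extra class and locals below are my own illustration rather than part of the program above, and spelling the steps out can itself change what the JIT emits, which is part of the puzzle.)

public class GetThreeExplicit
{
    // Same computation as GetThree.OneThird, with the conversions made explicit
    public double OneThird(ProvideThree provideThree)
    {
        short three = provideThree.Three;   // short value from the property
        float divisor = three;              // implicit short -> float promotion
        float quotient = 1.0f / divisor;    // single-precision (float) division
        return quotient;                    // implicit float -> double widening on return
    }
}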
Asked Aug 19 '14 by IanR

1 Answer

I believe James' conjecture is correct and this is a JIT optimization: when the JIT is able to optimize the call, it appears to carry out the division at a higher (double) precision, which accounts for the difference. The following code sample duplicates your results when compiled in Release mode with an x64 target and executed directly from a command prompt. I'm using Visual Studio 2008 with .NET 3.5.

    public static void Main()
    {
        double result = 1.0f / new ProvideThree().Three;
        double resultVirtual = 1.0f / new ProvideVirtualThree().Three;
        double resultConstant = 1.0f / 3;
        short parsedThree = short.Parse("3");
        double resultParsed = 1.0f / parsedThree;

        Console.WriteLine("Result of 1.0f / ProvideThree = {0}", result);
        Console.WriteLine("Result of 1.0f / ProvideVirtualThree = {0}", resultVirtual);
        Console.WriteLine("Result of 1.0f / 3 = {0}", resultConstant);
        Console.WriteLine("Result of 1.0f / parsedThree = {0}", resultParsed);

        Console.ReadLine();
    }

    public class ProvideThree
    {
        public short Three
        {
            get { return 3; }
        }
    }

    public class ProvideVirtualThree
    {
        public virtual short Three
        {
            get { return 3; }
        }
    }

The results are as follows:

Result of 1.0f / ProvideThree = 0.333333333333333
Result of 1.0f / ProvideVirtualThree = 0.333333343267441
Result of 1.0f / 3 = 0.333333333333333
Result of 1.0f / parsedThree = 0.333333343267441
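
The two distinct values line up with the precision at which the division is carried out. As a quick check of my own (not part of the answer's sample; the comments show the default .NET Framework formatting of each value):

    const float oneThirdSingle = 1f / 3f;       // nearest float to one third
    Console.WriteLine((double)oneThirdSingle);  // 0.333333343267441 - float division, widened to double
    Console.WriteLine(1.0 / 3.0);               // 0.333333333333333 - division carried out in double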

The IL is fairly straightforward:

.locals init ([0] float64 result,
           [1] float64 resultVirtual,
           [2] float64 resultConstant,
           [3] int16 parsedThree,
           [4] float64 resultParsed)
IL_0000:  ldc.r4     1.    // push 1 onto stack as 32-bit float    
IL_0005:  newobj     instance void Romeo.Program/ProvideThree::.ctor()
IL_000a:  call       instance int16 Romeo.Program/ProvideThree::get_Three()
IL_000f:  conv.r4          // convert result of method to 32-bit float 
IL_0010:  div          
IL_0011:  conv.r8          // convert result of division to 64-bit float (double)
IL_0012:  stloc.0
IL_0013:  ldc.r4     1.    // push 1 onto stack as 32-bit float
IL_0018:  newobj     instance void Romeo.Program/ProvideVirtualThree::.ctor()
IL_001d:  callvirt   instance int16 Romeo.Program/ProvideVirtualThree::get_Three()
IL_0022:  conv.r4          // convert result of method to 32-bit float 
IL_0023:  div
IL_0024:  conv.r8          // convert result of division to 64-bit float (double)
IL_0025:  stloc.1
IL_0026:  ldc.r8     0.33333333333333331    // constant folding
IL_002f:  stloc.2
IL_0030:  ldstr      "3"
IL_0035:  call       int16 [mscorlib]System.Int16::Parse(string)
IL_003a:  stloc.3          // store result of parse in parsedThree
IL_003b:  ldc.r4     1.
IL_0040:  ldloc.3      
IL_0041:  conv.r4          // convert result of parse to 32-bit float
IL_0042:  div
IL_0043:  conv.r8          // convert result of division to 64-bit float (double)
IL_0044:  stloc.s    resultParsed

The first two cases are nearly identical. The IL first pushes 1 onto the stack as a 32-bit float, obtains the 3 from one of the two getters, converts it to a 32-bit float, performs the division, and then converts the result to a 64-bit float (double). The fact that (nearly) identical IL, where the only difference is callvirt vs. call, produces different results points squarely at the JIT.

In the third case the compiler has already folded the division into a constant (note the ldc.r8 of the double value 0.33333333333333331), so the div IL instruction is never executed for this case.

In the final case I use a Parse operation to minimize the chance of the statement being optimized (I'd say "prevent", but I don't know enough about what the compiler is doing). The result for this case matches the result from the virtual call. It appears that the JIT is either optimizing the non-virtual method away or performing the division in a different manner.
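
One way to probe the inlining theory further (my own suggestion, not something tested in this answer) would be to block inlining of the non-virtual getter with MethodImplOptions.NoInlining. If inlining is what lets the JIT perform the more precise division, the result should then match the virtual case:

    // requires: using System.Runtime.CompilerServices;
    public class ProvideThreeNoInline
    {
        public short Three
        {
            // ask the JIT not to inline this non-virtual getter
            [MethodImpl(MethodImplOptions.NoInlining)]
            get { return 3; }
        }
    }

    // Expectation (untested): 1.0f / new ProvideThreeNoInline().Three
    // should print 0.333333343267441, like the virtual case.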

Interestingly, if you eliminate the parsedThree variable and simply write the fourth case as resultParsed = 1.0f / short.Parse("3"), the result is the same as in the first case. Again, it appears the JIT executes the division differently when it can.
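
Spelled out, the two forms being compared are the following (results as reported above, on this particular compiler and JIT):

    short parsedThree = short.Parse("3");
    double viaLocal = 1.0f / parsedThree;       // reported: 0.333333343267441 (matches the virtual case)

    double viaCall = 1.0f / short.Parse("3");   // reported: 0.333333333333333 (matches the first case)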

Answered Oct 25 '22 by Mike Cowan