 

Is there difference in performance between not taking a return value and taking and discarding it? [duplicate]

Tags:

c#

Given the initial code:

  callSomeFunction(someParameter);

Currently, the function returns a value, but that value is not used. However, we now need to provide a trace log, so we revise the code as follows:

#if DEBUG
    Debug.Print($"Entering function with {nameof(someParameter)}: {someParameter}");
#endif
    var result = callSomeFunction(someParameter);
#if DEBUG
    Debug.Print($"Leaving function with result: {result}");
#endif

In a release build, the code is basically equivalent to:

   var result = callSomeFunction(someParameter);

and obviously the result is not used. With that change, are there any performance ramifications simply because the call now takes a return value when it originally didn't? The reasoning for not writing two separate blocks instead was that the latter would be error-prone: there would be two call sites to maintain, whereas the version above keeps only one.
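For comparison, here is a minimal sketch of the "two different blocks" alternative mentioned above, using the question's placeholder names; it shows the duplicated call site that the single-call-site version avoids:

#if DEBUG
    Debug.Print($"Entering function with {nameof(someParameter)}: {someParameter}");
    var result = callSomeFunction(someParameter);
    Debug.Print($"Leaving function with result: {result}");
#else
    callSomeFunction(someParameter);   // second call site that must be kept in sync
#endif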

asked Nov 07 '22 by this


1 Answer

It makes no difference whether you assign the return value to a variable or not. At least in Release builds, the compiler and the JIT will notice that result is never read and will optimize the unused assignment away.

And even in Debug builds, the assignment itself causes no heap allocation (and therefore no GC pressure): result is a local variable, so its value (or its reference, if the return type is a reference type) lives on the stack or in a register.
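If you want to check this empirically, here is a minimal, hypothetical BenchmarkDotNet sketch; the class name, the body of callSomeFunction, and the parameter value are illustrative assumptions, not taken from the question:

    using System.Runtime.CompilerServices;
    using BenchmarkDotNet.Attributes;
    using BenchmarkDotNet.Running;

    public class DiscardVsAssign
    {
        private readonly int someParameter = 42;

        // NoInlining keeps the JIT from folding the call away entirely,
        // so both benchmarks measure the call itself.
        [MethodImpl(MethodImplOptions.NoInlining)]
        private int callSomeFunction(int x) => x * 2 + 1;

        [Benchmark(Baseline = true)]
        public void CallOnly() => callSomeFunction(someParameter);

        [Benchmark]
        public void CallAndAssign()
        {
            var result = callSomeFunction(someParameter);
            // result is never read; the JIT is expected to elide the store.
        }

        public static void Main() => BenchmarkRunner.Run<DiscardVsAssign>();
    }

Under those assumptions, the two benchmarks should report essentially identical timings in a Release build, which is consistent with the unused assignment being optimized away.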

answered Nov 14 '22 by Heinz Kessler