Well, I wanted to hash a password, so I had a look at how ASP.NET Identity does it in the Microsoft.AspNet.Identity.Crypto class, and I came across this function (which is used to compare the two password hashes):
[MethodImpl(MethodImplOptions.NoOptimization)]
private static bool ByteArraysEqual(byte[] a, byte[] b)
{
    if (object.ReferenceEquals(a, b))
    {
        return true;
    }
    if (((a == null) || (b == null)) || (a.Length != b.Length))
    {
        return false;
    }
    bool flag = true;
    for (int i = 0; i < a.Length; i++)
    {
        flag &= a[i] == b[i];
    }
    return flag;
}
This is a direct copy from the Reflector output.

Now my question is: what is the NoOptimization attribute good for, and why should it be there (what would happen if I removed it)? To me, it looks like a default Equals() implementation up until the for-loop.

I tried to have a look at the IL, but it is all nonsense to me :/
If a "smart" compiler turned this function into one that returns false as soon as a mismatch is found, code using it could become vulnerable to a timing attack: an attacker could figure out where the first mismatching byte is based on how long the function takes to return.

This isn't science fiction, even though it may sound like it. Even with the Internet in the way, an attacker who suspects the implementation short-circuits can collect a large number of timing samples and use statistics to recover that signal.
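To illustrate the difference, here is a hypothetical sketch (the method names ByteArraysEqualFast and ByteArraysEqualConstantTime are mine, not the framework's): the first version bails out at the first mismatch, so its running time leaks where the arrays first differ; the second always touches every byte, like the loop above, so its running time depends only on the length:

// Hypothetical short-circuiting version: returns as soon as a mismatch
// is found, so its running time reveals the position of the first
// differing byte.
private static bool ByteArraysEqualFast(byte[] a, byte[] b)
{
    if (a == null || b == null || a.Length != b.Length)
    {
        return false;
    }
    for (int i = 0; i < a.Length; i++)
    {
        if (a[i] != b[i])
        {
            return false; // early exit leaks timing information
        }
    }
    return true;
}

// Constant-time style: XOR accumulates any differences into diff, and
// the loop always runs to the end regardless of the data, so the
// running time does not depend on where (or whether) the arrays differ.
private static bool ByteArraysEqualConstantTime(byte[] a, byte[] b)
{
    if (a == null || b == null || a.Length != b.Length)
    {
        return false;
    }
    int diff = 0;
    for (int i = 0; i < a.Length; i++)
    {
        diff |= a[i] ^ b[i];
    }
    return diff == 0;
}

The XOR-and-accumulate pattern in the second version is a common way to write constant-time comparisons; the framework's flag &= a[i] == b[i] loop achieves the same goal of never exiting early, and the NoOptimization attribute is there to stop the compiler from "helpfully" turning it into the first version.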
There may also be cases where the JIT compiler's optimizations change a method's observable behavior. I have encountered situations where an optimized method behaved differently from a non-optimized one; declaring just that one method as

[MethodImpl(MethodImplOptions.NoOptimization)]

solved the problem.