
C# Interface behavior `IBinaryInteger<T>.GetByteCount()`

Tags:

c#

I found some weird behavior when working with numbers in C# (.NET 8, SDK 8.0.400).

I created a custom type to wrap numbers; here is a minimal example that reproduces the behavior:

using System.Numerics;

struct Mystery<T>(T value)
    where T : IBinaryInteger<T>
{
    public T Value { get; } = value;

    public int GetByteCount() => Value.GetByteCount();
}

Execution code:

Mystery<BigInteger> mystery = new(short.MaxValue);

// Mystery<BigInteger> mystery = new(new BigInteger(short.MaxValue)); // same behavior

Console.WriteLine(mystery.GetByteCount()); // 4
Console.WriteLine(mystery.Value.GetByteCount()); // 2
Console.WriteLine(((IBinaryInteger<BigInteger>)mystery.Value).GetByteCount()); // 4

Output:

4
2
4

Since I created the Mystery type to wrap around the number, I would want mystery.GetByteCount() to behave exactly like Value.GetByteCount(). However, as seen in the output, my code is not working as intended.

Where did I go wrong? How should I modify my Mystery.GetByteCount() so that it can return the intended Value.GetByteCount()?


Here is the NUnit test to clarify the expected behavior:

[TestCase(0)] // expect 1, but was 4
[TestCase(byte.MaxValue)] // expect 2, but was 4
[TestCase(short.MaxValue)] // expect 2, but was 4
[TestCase(char.MaxValue)] // expect 3, but was 4
[TestCase(int.MaxValue)] // expect 4, PASSED
[TestCase(long.MaxValue)] // expect 8, PASSED
public void MysteryBigInt_GetByteCount(long value)
{
    Mystery<BigInteger> mystery = new(value);
    BigInteger bigInt = new(value);
    int expectedByteCount = bigInt.GetByteCount();
    Assert.Multiple(
        () =>
        {
            // PASSED
            Assert.That(mystery.Value.GetByteCount(), Is.EqualTo(expectedByteCount));

            // only passed for int and long
            Assert.That(mystery.GetByteCount(), Is.EqualTo(expectedByteCount));
        });
}
Asked Jan 21 '26 by Nin

2 Answers

Since I created the Mystery type to wrap around the number, I would want mystery.GetByteCount() to behave exactly like Value.GetByteCount().

This is an incorrect expectation in general. Consider this case:

Impl x = new();
new Mystery<Impl>(x).GetByteCount(); // 1
x.GetByteCount(); // 2

interface IFoo {
    int GetByteCount();
}

struct Impl: IFoo {
    int IFoo.GetByteCount() => 1;
    public int GetByteCount() => 2;
}

struct Mystery<T>(T value)
    where T : IFoo
{
    public T Value { get; } = value;

    public int GetByteCount() => Value.GetByteCount();
}

IFoo is analogous to IBinaryInteger, and Impl is analogous to BigInteger.


When you do mystery.Value.GetByteCount(), you are calling a different GetByteCount from the one that BigInteger uses to implement IBinaryInteger.

In fact, BigInteger implements IBinaryInteger.GetByteCount via an explicit interface implementation,

int IBinaryInteger<BigInteger>.GetByteCount();

so you will not be able to access this implementation directly on BigInteger. You must access this through the IBinaryInteger interface like in your third line with a cast to the interface.

Directly calling mystery.Value.GetByteCount() would call this other method instead.

public int GetByteCount(bool isUnsigned = false);

These two methods, though named identically, have different semantics.

The interface method does this:

Gets the number of bytes that will be written as part of TryWriteLittleEndian(Span<Byte>, Int32).

From the implementation, it seems this will always return a multiple of 4, presumably because BigInteger internally uses a uint array to represent the integer. This is consistent with the semantics of the other IBinaryInteger implementations: ((IBinaryInteger<uint>)10u).GetByteCount() returns 4 (sizeof(uint)), not 1, even though the value is less than 256.
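This fixed-size behavior is easy to observe with a small generic helper that forces the constrained interface call (the helper name here is illustrative, not part of any API):

```csharp
using System.Numerics;

// Sketch: route every call through IBinaryInteger<T>.GetByteCount(),
// which reports the storage size of the representation, not the
// minimal encoding of the current value.
Console.WriteLine(InterfaceByteCount(10u));                // 4: always sizeof(uint)
Console.WriteLine(InterfaceByteCount((byte)10));           // 1: always sizeof(byte)
Console.WriteLine(InterfaceByteCount(new BigInteger(10))); // 4: one whole uint "limb"

static int InterfaceByteCount<T>(T value) where T : IBinaryInteger<T>
    => value.GetByteCount();
```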

On the other hand, the much older, BigInteger-specific GetByteCount does this:

Gets the number of bytes that will be output by ToByteArray(Boolean, Boolean) and TryWriteBytes(Span<Byte>, Int32, Boolean, Boolean).

From the documentation of ToByteArray, we can see that this method will return the fewest number of bytes that is needed to represent the integer.

Returns the value of this BigInteger as a byte array using the fewest number of bytes possible. If the value is zero, returns an array of one byte whose element is 0x00.


At the end of the day, BigInteger just doesn't implement IBinaryInteger the way you want, so if you want all three lines to behave the same way, you'd need to special-case BigInteger (and any other type whose IBinaryInteger implementation you "don't like") in your implementation:

public int GetByteCount() {
    if (Value is BigInteger bigInt) {
        return bigInt.GetByteCount();
    } else {
        return Value.GetByteCount();
    }
}
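Folding that special case into the question's Mystery type gives a self-contained check (a sketch, assuming .NET 8 top-level statements):

```csharp
using System.Numerics;

var mystery = new Mystery<BigInteger>(short.MaxValue);
Console.WriteLine(mystery.GetByteCount());       // 2
Console.WriteLine(mystery.Value.GetByteCount()); // 2: now consistent with the wrapper

struct Mystery<T>(T value) where T : IBinaryInteger<T>
{
    public T Value { get; } = value;

    // Special-case BigInteger so the wrapper matches its public
    // GetByteCount() (minimal encoding) instead of the interface's
    // multiple-of-4 storage size.
    public int GetByteCount()
        => Value is BigInteger bigInt ? bigInt.GetByteCount() : Value.GetByteCount();
}
```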
Answered Jan 24 '26 by Sweeper


You got confused: you did indeed pass a short (or a short-compatible value), but you also specified the generic type argument as BigInteger, so your type is Mystery<BigInteger>.

The constrained call Value.GetByteCount() therefore resolves to BigInteger's implementation of IBinaryInteger<BigInteger>.GetByteCount(), which consistently returns 4 here, not the 2 you would expect from IBinaryInteger<short>.

To force the correct type's implementation, you could adjust the Mystery struct to hold the interface:

struct Mystery<T>(IBinaryInteger<T> value) where T : IBinaryInteger<T>
{
    public IBinaryInteger<T> Value { get; } = value;

    public int GetByteCount() => Value.GetByteCount();
}

and create it like this:

var mystery = new Mystery<short>(short.MaxValue);

Then it would output 2 in all your cases.
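A self-contained check of this approach (a sketch); note the trade-off that storing an IBinaryInteger<T> boxes the value, which is the cost of getting per-type interface dispatch:

```csharp
using System.Numerics;

var mystery = new Mystery<short>(short.MaxValue);
Console.WriteLine(mystery.GetByteCount()); // 2: dispatches to short's implementation

struct Mystery<T>(IBinaryInteger<T> value) where T : IBinaryInteger<T>
{
    // The value is boxed into the interface reference; the call below
    // goes through interface dispatch, so each T reports its own size.
    public IBinaryInteger<T> Value { get; } = value;

    public int GetByteCount() => Value.GetByteCount();
}
```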

Answered Jan 24 '26 by Michał Turczyn


