I'm having an extremely difficult time understanding BitSet.valueOf(bytearray)
I have the following code:
byte[] a = new byte[]{(byte) 0x2D, (byte) 0x04};
//binary => 0010 1101 0000 0100
BitSet bs = BitSet.valueOf(a);
System.out.println(bs);
The code above gives me an output of {0, 2, 3, 5, 10}. Why?
I thought it was supposed to return the indices that are true (hold a 1), counted backwards, which should then be {2, 8, 10, 11, 13}.
As you would expect, BitSet
is doing the right thing. You seem to be misinterpreting which bit is zero and which one is seven. For the first byte, your binary representation is correct, but remember that the first bit is on the right (lowest to highest, as a weird artifact of how we write numbers):
Bit value:  0 0 1 0 1 1 0 1
Index:      7 6 5 4 3 2 1 0
Reading off the set indices for the first byte gives 0, 2, 3, and 5; the second byte (0x04) has its bit 2 set, which lands at index 10 (2 + 8).
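If it helps, here is a small sketch (the class name FirstByteBits is just made up for the example) that checks each bit of 0x2D from the least significant end against what the BitSet reports:

import java.util.BitSet;

public class FirstByteBits {
    public static void main(String[] args) {
        byte b = (byte) 0x2D;                        // 0010 1101
        BitSet bs = BitSet.valueOf(new byte[]{b});

        // Bit i of the byte, counted from the least significant end,
        // becomes index i in the BitSet.
        for (int i = 0; i < 8; i++) {
            boolean bitFromByte = ((b >> i) & 1) == 1;
            System.out.println("index " + i + ": " + bitFromByte
                    + " (BitSet: " + bs.get(i) + ")");
        }
        System.out.println(bs);                      // prints {0, 2, 3, 5}
    }
}

It prints true for indices 0, 2, 3, and 5, matching the BitSet's own answer.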
The full 16-bit sequence is the first byte from LSB to MSB, followed by the second byte from LSB to MSB:
1011 0100 0010 0000
| ||  |     |
0 23  5     10
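The Javadoc for BitSet.valueOf(byte[]) states exactly this rule: bs.get(n) == ((bytes[n/8] & (1 << (n%8))) != 0) for all n < 8 * bytes.length. Here is a quick sketch that verifies it for your array (the class name ValueOfCheck is just for the example):

import java.util.Arrays;
import java.util.BitSet;

public class ValueOfCheck {
    public static void main(String[] args) {
        byte[] a = new byte[]{(byte) 0x2D, (byte) 0x04};
        BitSet bs = BitSet.valueOf(a);

        // Javadoc rule: bit n is set iff (a[n/8] & (1 << (n%8))) != 0.
        for (int n = 0; n < 8 * a.length; n++) {
            boolean expected = (a[n / 8] & (1 << (n % 8))) != 0;
            if (bs.get(n) != expected) {
                throw new AssertionError("mismatch at bit " + n);
            }
        }
        System.out.println(bs);                                 // {0, 2, 3, 5, 10}
        System.out.println(Arrays.toString(bs.toByteArray()));  // [45, 4]
    }
}

Note that toByteArray() round-trips to the original bytes (45 is 0x2D, 4 is 0x04), so no information is lost; only the index convention differs from what you expected.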