In programming, when we say "the 7th least significant bit", is there a standard for whether it means bit 7 or bit 6 (if we start counting from bit 0)?
If we say "the 2nd least significant bit", that sounds like bit 1 (again counting from bit 0). So if "2nd" means bit 1, then "7th" should mean bit 6, not bit 7.
A standard? Like an ISO standard? No, although quite a few of them do count bits from b0. But, in English terms, the second least significant bit is one removed from the (first) least significant bit, so that would be b1.
So the seventh would be b6. In an octet, the most significant bit, b7, would be the eighth least significant bit.
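To make the ordinal-to-index mapping concrete, here is a minimal C sketch (the function name `nth_lsb` is made up for illustration) that treats "nth least significant bit" as 1-based, so the nth least significant bit is bit n-1:

```c
#include <stdio.h>

/* Returns the nth least significant bit of x, where n is 1-based:
 * the 1st least significant bit is b0, the 2nd is b1, and so on. */
unsigned nth_lsb(unsigned x, unsigned n)
{
    return (x >> (n - 1)) & 1u;
}

int main(void)
{
    unsigned x = 0x5Au;            /* binary 0101 1010 */
    printf("%u\n", nth_lsb(x, 2)); /* 2nd LSB -> b1 -> prints 1 */
    printf("%u\n", nth_lsb(x, 7)); /* 7th LSB -> b6 -> prints 1 */
    return 0;
}
```

The shift by n - 1 rather than n is exactly the off-by-one the question is about: the "first" least significant bit is b0, so no shift is needed to reach it.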
For what it's worth, I don't think I've ever heard the phrase "the 7th least significant bit" in my entire 30-odd-year working life. It's always been bN (where N ranges from 0 to the number of bits minus one) or just the least or most significant bit (not even second most significant).