
Why is 1 Byte equal to 8 Bits? [closed]

Why not 4 bits, or 16 bits?

I assume some hardware-related reasons and I'd like to know how 8bit 1byte became the standard.

asked Mar 16 '17 by aerin

1 Answer

It's been a minute since I took computer organization, but the Wikipedia article on 'Byte' gives some context.

The byte was originally the smallest number of bits that could hold a single character. ASCII is technically a 7-bit code (128 characters), but it fits comfortably in 8 bits, and ASCII text is still stored one character per byte, so 8 bits per character remains the norm. This sentence, for instance, is 41 bytes. That's easily countable and practical for our purposes.
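As a quick sanity check of the one-byte-per-character arithmetic, here is a minimal Python sketch that encodes the example sentence as ASCII and counts the resulting bytes:

```python
# Under ASCII, encoding a string yields exactly one byte per character.
sentence = "This sentence, for instance, is 41 bytes."
data = sentence.encode("ascii")

print(len(data))                    # -> 41
print(len(data) == len(sentence))   # -> True
```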

If we had only 4 bits, there would be just 16 (2^4) possible characters, unless we used 2 bytes to represent a single character, which is computationally less efficient. If a byte were 16 bits, we could represent 65,536 (2^16) possible characters, but for a character set as small as ours most of that range would be 'dead space', and byte-level operations would waste memory and bandwidth.
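The trade-off above comes down to how many distinct values n bits can hold, which is 2^n. A short Python loop makes the comparison concrete:

```python
# Number of distinct characters representable in n bits: 2**n.
for bits in (4, 8, 16):
    print(f"{bits:>2} bits -> {2 ** bits:>5} values")

# 4 bits  ->    16  (too few for letters, digits, and punctuation)
# 8 bits  ->   256  (comfortably holds the 128 ASCII codes)
# 16 bits -> 65536  (far more than a basic character set needs)
```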

Additionally, a byte splits evenly into 2 nibbles. Each nibble is 4 bits, which is the smallest number of bits that can encode any single decimal digit from 0 to 9 (10 distinct values).
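To illustrate the byte-to-nibble split, here is a small sketch (the helper name `nibbles` is my own, not from the answer) that extracts the two 4-bit halves of a byte with a shift and a mask:

```python
def nibbles(byte):
    """Split an 8-bit value into its (high, low) 4-bit nibbles."""
    high = (byte >> 4) & 0x0F  # top 4 bits, shifted down
    low = byte & 0x0F          # bottom 4 bits
    return high, low

print(nibbles(0x4B))  # -> (4, 11)
```

Because each nibble holds 0-15, it can always store one decimal digit, which is how binary-coded decimal packs two digits per byte.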

answered Oct 15 '22 by Bango