
How many bits represent one character, and how many bits represent one byte, in ASCII?

I know it's simple, but I still don't know the answer. Some people say there are 7 bits per character, while others say 8. So can anyone tell me which one is right? If it is 8 bits per character, then how many bits represent a byte? And if it's 7, then how many bits represent a character and how many bits represent one byte?

asked Oct 29 '25 at 19:10 by J patel

1 Answer

US-ASCII is indeed 7 bits per character. The highest code has value 127, which represents the DEL control character. Any character set that has codes with higher values is not US-ASCII (but may be an extension of it, such as Unicode).
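A quick way to see that 7-bit limit in practice: a character whose code is at most 127 is US-ASCII, and anything higher is not. Here is a minimal Python sketch (the function name is just for illustration):

```python
def is_us_ascii(text):
    """True if every character fits in 7 bits (codes 0 through 127)."""
    return all(ord(ch) <= 127 for ch in text)

print(is_us_ascii("Hello!"))  # True  - all codes are 7-bit
print(is_us_ascii("héllo"))   # False - 'é' is code 233, outside US-ASCII
print(ord("\x7f"))            # 127   - DEL, the highest US-ASCII code
```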

Most microprocessors work with bytes (=smallest addressable unit of storage) of eight bits. If you want to use US-ASCII with these microprocessors, you have two options:

  • Use 7 bytes (of 8 bits each) to store 8 characters (of 7 bits each), even though that makes programs very complicated (a packing sketch follows this list).
  • Use 1 byte (of 8 bits) to store 1 character (of 7 bits), even though you'll waste space.
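To make the first option concrete: eight 7-bit characters take 56 bits, which is exactly seven 8-bit bytes. Below is a minimal Python sketch of that packing scheme (pack_7bit is a hypothetical helper, not a standard function):

```python
def pack_7bit(chars):
    """Pack US-ASCII characters (7 bits each) into bytes (8 bits each)."""
    bits = 0       # pending bit buffer
    nbits = 0      # number of pending bits in the buffer
    out = bytearray()
    for ch in chars:
        bits = (bits << 7) | (ord(ch) & 0x7F)  # append the 7-bit code
        nbits += 7
        while nbits >= 8:                      # emit every full byte
            nbits -= 8
            out.append((bits >> nbits) & 0xFF)
    if nbits:                                  # zero-pad a trailing partial byte
        out.append((bits << (8 - nbits)) & 0xFF)
    return bytes(out)

packed = pack_7bit("ABCDEFGH")  # 8 characters of 7 bits = 56 bits
print(len(packed))              # 7 bytes - no space wasted
```

The unpacking side has to undo all of this bit shuffling, which is exactly the complication the first option warns about.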

The need for simple programs outweighs the need for efficient memory use in this case. That's why you usually use one 8-bit unit (an octet) to store a character, even though each character is encoded in only 7 bits. You just set the extra bit to zero (or, as was done in some systems, use the extra bit for error detection).
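That error-detection use of the extra bit is a parity bit: bit 7 is set so that the total number of 1-bits in the byte comes out even (or odd, depending on the convention). A minimal sketch assuming even parity (with_even_parity is a hypothetical name):

```python
def with_even_parity(code):
    """Store a 7-bit code in 8 bits, using bit 7 as an even-parity bit."""
    ones = bin(code & 0x7F).count("1")        # count 1-bits in the 7-bit code
    return (code & 0x7F) | ((ones & 1) << 7)  # set bit 7 if the count is odd

c = ord("C")                     # 0b1000011, three 1-bits (odd)
print(bin(with_even_parity(c)))  # 0b11000011, four 1-bits (even)
```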

