 

System where 1 byte != 8 bits? [duplicate]

I keep reading sentences like

don't rely on 1 byte being 8 bits in size

use CHAR_BIT instead of 8 as a constant to convert between bits and bytes

et cetera. What real-life systems are there today where this holds true? (I'm not sure if there are differences between C and C++ regarding this, or if it's actually language-agnostic. Please retag if necessary.)
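
For concreteness, the quoted CHAR_BIT advice might translate to something like the following minimal sketch (bytes_for_bits is just an illustrative helper name, not from any library):

    /* Use CHAR_BIT from <limits.h> instead of hard-coding 8. */
    #include <limits.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Number of bytes required to hold nbits bits, rounding up. */
    static size_t bytes_for_bits(size_t nbits)
    {
        return (nbits + CHAR_BIT - 1) / CHAR_BIT;   /* not (nbits + 7) / 8 */
    }

    int main(void)
    {
        printf("CHAR_BIT = %d\n", CHAR_BIT);
        printf("100 bits fit in %zu bytes here\n", bytes_for_bits(100));
        return 0;
    }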

asked Apr 01 '11 by Xeo



1 Answer

On older machines, codes smaller than 8 bits were fairly common, but most of those have been dead and gone for years now.

C and C++ have mandated a minimum of 8 bits for char, at least as far back as the C89 standard. [Edit: For example, C90, §5.2.4.2.1 requires CHAR_BIT >= 8 and UCHAR_MAX >= 255. C89 uses a different section number (I believe that would be §2.2.4.2.1) but identical content.] They treat "char" and "byte" as essentially synonymous [Edit: for example, CHAR_BIT is described as the "number of bits for the smallest object that is not a bit-field (byte)".]
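
You can (re)confirm those guarantees on a given implementation with a quick compile-time check; a minimal sketch, assuming a C11 compiler for _Static_assert (C++11 spells it static_assert):

    #include <limits.h>

    _Static_assert(CHAR_BIT >= 8,     "C requires at least 8 bits per byte");
    _Static_assert(UCHAR_MAX >= 255,  "unsigned char must cover at least 0..255");
    _Static_assert(sizeof(char) == 1, "sizeof(char) is 1 byte by definition");

    int main(void) { return 0; }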

There are, however, current machines (mostly DSPs) where the smallest type is larger than 8 bits -- a minimum of 12, 14, or even 16 bits is fairly common. Windows CE does roughly the same: its smallest type (at least with Microsoft's compiler) is 16 bits. They do not, however, treat a char as 16 bits -- instead they take the (non-conforming) approach of simply not supporting a type named char at all.
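
Code that has to run on such targets often branches on CHAR_BIT at compile time. A hedged sketch of that idea (OCTETS_PER_CHAR and octet_capacity are made-up names, not standard macros or APIs):

    /* Sketch: adapting to non-8-bit chars at compile time. */
    #include <limits.h>
    #include <stddef.h>

    #if CHAR_BIT == 8
      #define OCTETS_PER_CHAR 1   /* the usual case: one char holds one octet */
    #elif CHAR_BIT == 16
      #define OCTETS_PER_CHAR 2   /* e.g. some DSPs: one char can pack two octets */
    #else
      #error "Unsupported CHAR_BIT: octet packing not implemented for this target"
    #endif

    /* How many octets a buffer of n_chars chars can carry on this implementation. */
    static size_t octet_capacity(size_t n_chars)
    {
        return n_chars * OCTETS_PER_CHAR;
    }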

answered Oct 04 '22 by Jerry Coffin