 

Why are there no octal literals in C#?

Tags:

c#

Why didn't the developers of C# include octal literals alongside hexadecimal and binary literals? What was the reason for that decision?

asked Oct 16 '25 by Марк Павлович


1 Answer

Octal encoding is a relic from computing in the late 1950s and '60s. Back then it was quite useful: companies built machines with a 6-bit byte, commonly packed into an 18-bit or 36-bit word. Six bits were enough for everybody; teletypes and printers did not yet have lowercase letters, and English was the dominant language.

Octal is a nice fit for such 6-bit bytes: each byte takes exactly two digits, and every bit is used.

That inevitably petered out. The IBM-360 was very influential, and it had an 8-bit byte. The PDP-11 of 1970 was important too: an affordable 16-bit machine with an 8-bit byte. By around 1975 the older architectures had acquired dinosaur status and programmers started heavily favoring hex. Octal is clumsy for encoding 8-bit bytes; hex gave us the two digits per byte back. Microprocessor kits of the era all used hex.
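A quick sketch of that digit arithmetic, written as C# since that is the language in question (the specific values are only illustrative):

```csharp
using System;

// Illustrative values only: the point is the digit count, not the numbers.
int sixBitByte = 0b11_1111;     // 63, the largest value a 6-bit byte can hold
int eightBitByte = 0b1111_1111; // 255, the largest value an 8-bit byte can hold

Console.WriteLine(Convert.ToString(sixBitByte, 8));   // "77"  -> two octal digits, every bit used
Console.WriteLine(Convert.ToString(eightBitByte, 8)); // "377" -> three octal digits, the top digit carries only 2 bits
Console.WriteLine(eightBitByte.ToString("X"));        // "FF"  -> two hex digits, a perfect fit for 8 bits
```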

Octal did last a lot longer than it should have. DEC manuals always used octal notation, even for the 8-bit machines. The C language brought it into the curly-brace languages: C narrowly started life on the GE-635 and PDP-8, machines from the 6-bit era, and it didn't become real C until the PDP-11, but the seed was planted. It came with a way to specify octal values that was far too easy: a leading 0 on the literal.

That convention produced countless bugs and extremely confused programmers. The C# team thoroughly removed such common bug generators from their curly-brace language. They did a terrific job.
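For illustration, a small C# sketch of what that leaves you with today (the binary literal requires C# 7.0 or later):

```csharp
using System;

// In C, `int n = 010;` quietly gives you 8, not 10 -- the classic leading-zero trap.
// C# has no octal literal form at all, so a stray leading zero is harmless:
int ten = 010;               // still 10 in C#; leading zeros carry no meaning
int hex = 0xFF;              // 255, hexadecimal literal
int bin = 0b1010_1010;       // 170, binary literal (C# 7.0+), with a digit separator

Console.WriteLine($"{ten} {hex} {bin}"); // prints: 10 255 170

// If you really do need octal, you have to ask for it explicitly:
int fromOctalText = Convert.ToInt32("377", 8); // 255
Console.WriteLine(fromOctalText);
```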

answered Oct 17 '25 by Hans Passant