 

Consequences of inability to add natural numbers in C

In System F I can define the genuine total addition function using Church numerals.

In Haskell I cannot define that function because of the bottom value. For example, in Haskell, if x + y = x, I cannot conclude that y is zero: if x is bottom, then x + y = x for any y. So addition is not the true addition but an approximation to it.
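A small runnable sketch of this point (the name bottomInt is mine): once x is bottom, x + y is bottom for every y, so the equation x + y = x, read denotationally, holds for all y and cannot pin y down to zero.

```haskell
import Control.Exception (SomeException, evaluate, try)

-- A bottom value at type Integer.
bottomInt :: Integer
bottomInt = error "bottom"

main :: IO ()
main = do
  -- Forcing bottomInt + 5 raises the error: the sum is bottom too,
  -- so "bottomInt + y = bottomInt" holds denotationally for any y.
  r <- try (evaluate (bottomInt + 5)) :: IO (Either SomeException Integer)
  putStrLn (case r of
    Left _  -> "bottom + 5 is bottom"
    Right v -> "got " ++ show v)
```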

In C I cannot define that function because the C specification requires everything to have a finite size. So in C the possible approximations are even worse than in Haskell.

So we have:

In System F it's possible to define the addition, but it's not possible to have a complete implementation (because there is no infinite hardware).

In Haskell it's not possible to define the addition (because of the bottom), and it's not possible to have a complete implementation either.

In C it's not possible to define the total addition function (because the semantics of everything is bounded), but compliant implementations are possible.
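For reference, the System F definition the question alludes to can be transcribed into Haskell with RankNTypes (the names here are mine). Because of Haskell's lifting, the Church type still contains bottoms, so this behaves as genuine addition only on total values:

```haskell
{-# LANGUAGE RankNTypes #-}

-- Church numerals: the System F encoding, written with a rank-2 type.
newtype Church = Church (forall a. (a -> a) -> a -> a)

zero :: Church
zero = Church (\_ z -> z)

suc :: Church -> Church
suc (Church n) = Church (\s z -> s (n s z))

-- Addition: apply the successor m times on top of n.
add :: Church -> Church -> Church
add (Church m) (Church n) = Church (\s z -> m s (n s z))

-- Convert to Integer for inspection.
fromChurch :: Church -> Integer
fromChurch (Church n) = n (+ 1) 0

main :: IO ()
main = print (fromChurch (add (suc (suc zero)) (suc zero)))
```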

So all three formal systems (Haskell, System F and C) seem to have different design tradeoffs.

So what are the consequences of choosing one over another?

asked Dec 22 '22 by nponeccop

2 Answers

Haskell

This is a strange problem because you're working with a vague notion of =. _|_ = _|_ only "holds" (and even then you should really use ⊑) at the domain-semantic level. If we distinguish between information available at the domain-semantic level and equality in the language itself, then it's perfectly correct to say that True ⊑ x + y == x --> True ⊑ y == 0.

It's not addition that's the problem, and it's not natural numbers that are the problem either -- the issue is simply distinguishing between equality in the language and statements about equality or information in the semantics of the language. Absent the issue of bottoms, we can typically reason about Haskell using naive equational logic. With bottoms, we can still use equational reasoning -- we just have to be more sophisticated with our equations.
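A small sketch of my own (not from the answer) of the defined-value reading of that implication: whenever x + y == x actually evaluates to True, y must be 0. Checking it over a finite range of Integers:

```haskell
-- Boolean implication: False implies anything, True implies only True.
implies :: Bool -> Bool -> Bool
implies a b = not a || b

-- The defined-value reading of: True ⊑ x + y == x --> True ⊑ y == 0.
holds :: Integer -> Integer -> Bool
holds x y = (x + y == x) `implies` (y == 0)

main :: IO ()
main = print (and [ holds x y | x <- [-3 .. 3], y <- [-3 .. 3] ])
```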

A fuller and clearer exposition of the relationship between total languages and the partial languages defined by lifting them is given in the excellent paper "Fast and Loose Reasoning is Morally Correct".

C

You claim that C requires everything (including addressable space) to have a finite size, and therefore that C semantics "impose a limit" on the size of representable naturals. Not really. The C99 standard says the following: "Any pointer type may be converted to an integer type. Except as previously specified, the result is implementation-defined. If the result cannot be represented in the integer type, the behavior is undefined. The result need not be in the range of values of any integer type." The rationale document further emphasizes that "C has now been implemented on a wide range of architectures. While some of these architectures feature uniform pointers which are the size of some integer type, maximally portable code cannot assume any necessary correspondence between different pointer types and the integer types. On some implementations, pointers can even be wider than any integer type."

As you can see, there's explicitly no assumption that pointers must be of a finite size.

answered Feb 12 '23 by sclv


You have a set of theories as frameworks to do your reasoning with; finite reality, Haskell semantics, and System F are just some of them.

You can choose an appropriate theory for your work, build a new theory from scratch, or assemble one from big pieces of existing theories. For example, you can consider the set of always-terminating Haskell programs and safely employ bottomless semantics. In this case your addition will be correct.
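A sketch of that point: on the inductive (finite, fully defined) values of a Peano naturals type, the usual recursive definition is genuine total addition, and add m n = m really does force n to be zero. The definitions below are my own illustration:

```haskell
-- Peano naturals: zero and successor.
data Nat = Z | S Nat deriving (Eq, Show)

-- Structural recursion on the first argument; total on finite Nats.
add :: Nat -> Nat -> Nat
add Z     n = n
add (S m) n = S (add m n)

-- Convert to Int for inspection.
toInt :: Nat -> Int
toInt Z     = 0
toInt (S n) = 1 + toInt n

main :: IO ()
main = print (toInt (add (S (S Z)) (S (S (S Z)))))
```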

For a low-level language there may be reasons to build finiteness in, but for a high-level language it is worth omitting such things, because more abstract theories allow wider application.

While programming, you use not the "language specification" theory but the "language specification + implementation limitations" theory, so there is no difference between memory limits appearing in the language specification and in the language implementation. The absence of limits becomes important when you start building purely theoretical constructions within the framework of the language semantics. For example, you may want to prove some program equivalences or language translations, and you will find that every unneeded detail in the language specification brings much pain to the proof.

answered Feb 12 '23 by Vag