 

Does (x == x + 1) always return false for integer x?

I saw this in an interview preparation book - Algorithms for Interviews. It did not say what the answer was.

Based on my knowledge, it does return false. Am I missing something?

asked Jul 13 '11 by user289925



2 Answers

JavaScript comes to mind; it has a special number value, Infinity.

So this will in fact return true:

var x = Infinity;
alert(x == x + 1);
answered Oct 15 '22 by kapa


With integers, x != x + 1 for all x. With floating-point numbers, that's not guaranteed: with a large enough exponent, the 1 fades into insignificance and is lost off the end of the mantissa. The easiest case to see is the largest possible exponent, which makes the value infinity, and infinity plus one is still infinity. But it also happens with smaller exponents.
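For instance, a minimal sketch in Python (whose floats are IEEE-754 doubles with a 53-bit significand):

# 2**53 + 1 is not representable as a double, so the +1 is rounded away
x = 2.0 ** 53
print(x == x + 1)    # True

# the extreme case described above: infinity plus one is still infinity
x = float('inf')
print(x == x + 1)    # True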

Then, in languages that support operator overloading, it's quite possible to break such trivial mathematical laws. For example, in Python:

>>> class X(object):
...    def __eq__(self, other):
...        return True
...    def __add__(self, other):
...        return self
...
>>> x = X()
>>> x == x + 1
True

Unless the type (and implementation) is specified in the question, it is not safe to assume the comparison is always false. But since the question specifies an integer, you can know that x != x + 1 for all x. Integers are stored as a series of bits, and adding one toggles the lowest bit, carrying into the next bit if it was 1, and so on; that can never reproduce the same bit pattern (regardless of how many bits the integer type has, provided it is more than zero). A sketch of this is given below.
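As a quick sketch of that point, here is a hypothetical fixed-width 32-bit integer simulated with masking (Python's own ints are arbitrary-precision, so the mask stands in for a machine word):

def add_one_32bit(x):
    # keep only the low 32 bits, so the value wraps like a machine integer
    return (x + 1) & 0xFFFFFFFF

x = 0xFFFFFFFF                  # the largest 32-bit value, all bits set
print(add_one_32bit(x))         # 0 - the carry falls off the top
print(x == add_one_32bit(x))    # False - even with wraparound, x + 1 != x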

answered Oct 15 '22 by Chris Morgan