Typecasting to 'int' in Python generating wrong result

I tried performing the following type conversion in Python 3.3:

int( 10**23 / 10 )

Output: 10000000000000000000000

But after increasing the power by one or more:

int( 10**24 / 10 )

Output: 99999999999999991611392

int( 10**25 / 10 )

Output: 999999999999999983222784

Why is this happening? A simple conversion like

int( 10**24 )

Output: 1000000000000000000000000

does not affect the value.

asked Mar 15 '23 by SMagic

2 Answers

You are doing floating-point division with the / operator. The result of 10**24 / 10 is a float, and that float is not an exact representation of the integer 10**23.

If you need an integer result, use floor division with // instead.

>>> type(10**24/10)
<class 'float'>
>>> type(10**24//10)
<class 'int'>
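To make the difference concrete, here is a small sketch showing that the rounding happens in the float division itself, before int() is ever applied (the values match those in the question):

```python
# Floor division stays in integer arithmetic, so the result is exact.
exact = 10**24 // 10

# True division produces a float first; for numbers this large the
# float is already rounded, and int() merely exposes that error.
rounded = int(10**24 / 10)

print(exact)             # 100000000000000000000000
print(rounded)           # 99999999999999991611392
print(exact == rounded)  # False
```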
answered Mar 24 '23 by Simon


In Python 3.x, / always performs true (floating-point) division. Using floor division // instead gives the expected result.

>>> int(10**25 // 10)
1000000000000000000000000

The reason for this behavior is that a float cannot store large integers precisely.

Assuming IEEE-754 double precision is used, it can store integers exactly only up to 2**53, which is approximately 10**16. Another example:

>>> int(10**17 / 10 + 1)
10000000000000000
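The 2**53 boundary mentioned above can be checked directly. This is a minimal sketch, assuming CPython's float is an IEEE-754 double (true on all common platforms):

```python
# 2**53 is the largest power of two up to which every integer
# fits exactly in a 53-bit double-precision mantissa.
limit = 2**53

print(float(limit) == limit)          # True: exactly representable
print(float(limit + 1) == limit + 1)  # False: rounds back to 2**53
```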
answered Mar 24 '23 by Yu Hao