 

Is there any way to tell 64-bit GHC to treat Int as Int32?

Tags: haskell, ghc

I'm writing a Haskell-to-JavaScript code generator, using GHC as a library. Since JavaScript doesn't have an integer type and its Number type can only represent integers up to 2⁵³ properly, I'm representing integers as Numbers and explicitly performing all arithmetic modulo 2³². This works very well with a 32-bit GHC, but rather worse with the 64-bit version.
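To make that concrete, here is a small Haskell sketch of my own (not code from the compiler; wrap32 and addJS are hypothetical names) modelling what "arithmetic modulo 2³² on a Number" means:

    import Data.Fixed (mod')

    -- Model of the generated JavaScript: an integer lives in a Double
    -- ("Number"), and every arithmetic result is normalised back into the
    -- signed 32-bit range.
    wrap32 :: Double -> Double
    wrap32 x =
      let m = x `mod'` 4294967296   -- reduce modulo 2^32, result in [0, 2^32)
      in if m >= 2147483648 then m - 4294967296 else m

    -- Addition as the generated code would perform it.
    addJS :: Double -> Double -> Double
    addJS a b = wrap32 (a + b)

    main :: IO ()
    main = print (addJS 2147483647 1)
      -- prints -2.147483648e9, i.e. -2^31: the sum wrapped like 32-bit arithmetic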

GHC will happily coerce Int64 values to Ints and interpret Int constants as 64-bit values (for example, 0xffffffff turns into 4294967295 rather than -1), and that causes all sorts of annoying problems.
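For illustration, a GHCi-style sketch of my own showing the difference: on a 64-bit GHC the literal keeps its full value unless it is explicitly narrowed through Int32.

    import Data.Int (Int32)

    main :: IO ()
    main = do
      -- On a 64-bit GHC, Int is 64 bits wide, so the literal keeps its value.
      print (0xffffffff :: Int)                           -- 4294967295
      -- Narrowing through Int32 recovers the 32-bit interpretation.
      print (fromIntegral (0xffffffff :: Int) :: Int32)   -- -1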

The compiler works very well for "normal" web stuff even on a 64-bit system, provided that the standard libraries are built on a 32-bit machine, but "please don't use large-ish numbers, OK?" isn't something you want to see in your compiler's manual. Some of the problems (but not all) can be alleviated by compiling with -O0, but that (unsurprisingly) produces code that is not only slow but also way too big.

So, I need to stop GHC from assuming that Int and Int64 are equivalent. Is this even possible?

asked by valderman

1 Answer

That is not possible without using a 32-bit GHC.

The Haskell language standard (Haskell 2010) says that the only thing you know about the Int type is that it has

at least the range [-2^29 .. 2^29-1]

So you can happily truncate Int values that fall outside this range and still be a fully compliant Haskell 2010 implementation!
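As an illustration (my own sketch, not part of the answer): Int32 already exceeds that minimum range, so an Int backed by 32 bits stays within the letter of the standard.

    import Data.Int (Int32)

    main :: IO ()
    main = do
      -- What this GHC's Int actually provides (platform dependent).
      print (minBound :: Int, maxBound :: Int)
      -- Int32 already covers more than the Haskell 2010 minimum of
      -- [-2^29 .. 2^29-1], so truncating to 32 bits stays compliant.
      print (minBound :: Int32, maxBound :: Int32)
      print (-(2 :: Integer) ^ 29, (2 :: Integer) ^ 29 - 1)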

However, you should probably not do this, and instead look for a way to provide a 64-bit integer type for JavaScript: the same trick GHC uses to support Int64 on 32-bit machines.
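A rough sketch of how such an emulation could look (my own illustration; the names I64, toI64, and addI64 are made up, and GHC's actual implementation differs): carry the 64-bit value as a high and a low 32-bit word and propagate the carry by hand.

    import Data.Bits (shiftL, shiftR, (.|.))
    import Data.Int (Int64)
    import Data.Word (Word32, Word64)

    -- A 64-bit integer represented as two 32-bit halves (high, low).
    data I64 = I64 !Word32 !Word32
      deriving (Eq, Show)

    toI64 :: Int64 -> I64
    toI64 n = I64 (fromIntegral (w `shiftR` 32)) (fromIntegral w)
      where w = fromIntegral n :: Word64

    fromI64 :: I64 -> Int64
    fromI64 (I64 hi lo) =
      fromIntegral ((fromIntegral hi `shiftL` 32) .|. fromIntegral lo :: Word64)

    -- Addition: add the low words, then carry any overflow into the high words.
    addI64 :: I64 -> I64 -> I64
    addI64 (I64 ah al) (I64 bh bl) =
      let lo    = al + bl                   -- Word32 addition wraps modulo 2^32
          carry = if lo < al then 1 else 0  -- wrap-around means the low word overflowed
      in I64 (ah + bh + carry) lo

    main :: IO ()
    main = print (fromI64 (addI64 (toI64 maxBound) (toI64 1)))  -- minBound :: Int64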

answered by Don Stewart