
How to make a #define use 64 bits

Tags:

c

Suppose I have the following:

#define MAX (16 * 1024 * 1024 * 1024)
#define MIN (1 * 1024 * 1024 * 1024)

This gives MAX = 0. I assume that's because the define only uses 32 bits. Is there a way to use 64 bits here, or do I need to rework my code so that the define works with a smaller value?
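
For reference, a minimal reproduction of the observed behaviour (assuming a platform where int is 32 bits, as described above; the result of the signed overflow is technically undefined, but 0 is what typical implementations produce):

#include <stdio.h>

#define MAX (16 * 1024 * 1024 * 1024)   /* every operand here is a 32-bit int */

int main(void)
{
    long long x = MAX;                  /* the int multiplication overflows before the assignment */
    printf("%lld\n", x);                /* typically prints 0 */
    return 0;
}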

asked Dec 14 '22 by TreeWater

2 Answers

This will give MAX = 0

No, this will replace MAX with the literal tokens ( 16 * 1024 * 1024 * 1024 ) during the preprocessing phase.

I assume that this is because the define is only using 32 bits for the define

The define isn't using any bits, it's just a text substitution.
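
As a sketch of what actually happens (the file name demo.c is hypothetical), running only the preprocessing step shows the raw token substitution; the int arithmetic, and hence the overflow, occurs when the compiler later evaluates the expression at its point of use:

/* demo.c */
#define MAX (16 * 1024 * 1024 * 1024)
long long x = MAX;

/* $ gcc -E demo.c   (or cc -E) ends with the line:   */
/*     long long x = (16 * 1024 * 1024 * 1024);       */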

Is there a way to use 64 bits for this

Using the type explicitly is perhaps nicer than using the integer literal suffix, because it's more explicit about exactly how many bits you get:

#define MAX ((uint64_t)16 * 1024 * 1024 * 1024)

or

#define MAX (16ll * 1024 * 1024 * 1024)
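
A short usage sketch of either variant (assuming <stdint.h> and <inttypes.h> are available for the fixed-width version):

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

#define MAX ((uint64_t)16 * 1024 * 1024 * 1024)
#define MIN ((uint64_t)1 * 1024 * 1024 * 1024)

int main(void)
{
    printf("MAX = %" PRIu64 "\n", MAX);   /* 17179869184 */
    printf("MIN = %" PRIu64 "\n", MIN);   /* 1073741824  */
    return 0;
}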
answered Dec 15 '22 by Useless


The reason this happens is that all of those constants are implicitly of type int, which on your platform appears to be a 32-bit type. You need to make sure you're working with a 64-bit type if that's the behaviour you want.

You can typecast it to make sure it's a 64-bit type:

#define MAX ((int64_t)16 * 1024 * 1024 * 1024)

Or just expand the math yourself:

#define MAX 17179869184
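
Either form can be checked with a quick print. A minimal sketch (the cast to long long keeps the %lld format correct whichever variant is used):

#include <stdio.h>

#define MAX 17179869184   /* = 16 * 1024 * 1024 * 1024; under C99, an unsuffixed decimal
                             constant that does not fit in int gets a wider type (long long here) */

int main(void)
{
    printf("MAX = %lld\n", (long long)MAX);   /* 17179869184 */
    return 0;
}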
answered Dec 15 '22 by Carl Norum