So I often see something like this:
#define gf_PI f32(3.14159265358979323846264338327950288419716939937510)
#define gf_PIhalf f32(3.14159265358979323846264338327950288419716939937510 * 0.5)
This means that the half-PI value is calculated every time I use gf_PIhalf in my code, right?
Wouldn't it be better to literally write the value of half PI instead?
Wouldn't it be even better to do the following:
#define gf_PI f32(3.14159265358979323846264338327950288419716939937510)
const float gf_PIHalf = gf_PI * 0.5f; // PIHalf is calculated once
Finally, wouldn't it be best to do it like this (and why doesn't this seem to be common practice):
const float gf_PI = 3.14159265358979323846264338327950288419716939937510f;
const float gf_PIHalf = gf_PI * 0.5f;
"This means that the half-PI value is calculated every time I use gf_PIhalf in my code, right?"
Nope, not likely.
You can reasonably count on your compiler to do that multiplication at compile time (constant folding), not at runtime.
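For instance, here is a minimal sketch that lets the compiler itself confirm the folding (assuming C++11 and a plain typedef standing in for the questioner's f32 alias):

typedef float f32; // assumption: stand-in for the questioner's f32 type

// constexpr (C++11) requires the initializer to be evaluated at
// compile time, so the multiplication can never happen at runtime.
constexpr f32 gf_PI     = 3.14159265358979323846f;
constexpr f32 gf_PIhalf = gf_PI * 0.5f;

// static_assert only accepts compile-time constants, so the fact
// that this line compiles shows gf_PIhalf is folded, not computed at runtime.
static_assert(gf_PIhalf == gf_PI * 0.5f, "gf_PIhalf is a compile-time constant");

int main() { return 0; }

Even without constexpr, any reasonable optimizing compiler folds a multiplication of two literal constants; constexpr just turns that expectation into a guarantee.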
Your conclusions are mostly right, except for two things: the #define version will almost certainly resolve at compile time as well, and typed const globals are not uncommon practice. They are common practice in modern, well-written code; #defines are all but dead for this use. The best practice is to define your file-scope globals in an unnamed namespace:
namespace
{
const float g_SomeGlobal = 123.456f;
}
This prevents anyone outside of your translation unit from being able to 'see' g_SomeGlobal.