What are the advantages and disadvantages of using #define over const (and vice versa)?
When I read about bad programming practices, specifically magic numbers, I found myself using #define more frequently. Some questions popped into my mind, such as:
Is it bad to use #define a lot?
Does it take memory space?
Would it be faster to use const instead?
I read a bit about this but I'm still not sure, from what I've understood:
#define defines a macro (I'm not sure exactly what "macro" means), and it is handled by the preprocessor: every instance of the defined identifier is replaced with its replacement text before the code is compiled. const, on the other hand, declares a variable whose value cannot be changed at runtime.
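For example (just a toy sketch I made up to check my understanding, the names WIDTH and width aren't from anywhere):

#define WIDTH 80          // the preprocessor pastes the token 80 wherever WIDTH appears
int cols = WIDTH;         // after preprocessing, the compiler sees: int cols = 80;

const int width = 80;     // no text substitution; the compiler sees a typed object
int rows = width;         // width is a real variable of type int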
The only reason I can think of for using const is when the value is computed from other values. For example:
#define PI 3.14159f
#define RADIUS 3.0f
#define AREAOFCIRCLE (PI*RADIUS*RADIUS)
would be inefficient, since every instance of AREAOFCIRCLE would be replaced by the expression PI*RADIUS*RADIUS, so the program would compute it every time you use AREAOFCIRCLE. On the other hand:
const float areaofcircle = PI*RADIUS*RADIUS;
would be more efficient, since the program would only compute it once.
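Putting the whole example together (a toy program; the printout is only there so both versions get used):

#include <iostream>

#define PI 3.14159f
#define RADIUS 3.0f
#define AREAOFCIRCLE (PI*RADIUS*RADIUS)

const float areaofcircle = PI*RADIUS*RADIUS; // evaluated once

int main() {
    std::cout << AREAOFCIRCLE << ' ' << areaofcircle << '\n'; // prints: 28.2743 28.2743
}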
So back to the original question, how does #define compare to const?
Don't worry about efficiency in this case: all of these values are computed at compile time (the compiler constant-folds an expression of constants like PI*RADIUS*RADIUS).
You should stop using macros (at least for defining constants) whenever you can. Macros ignore namespaces and scopes, whereas const objects have a type, which helps the compiler catch unintended mistakes.
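A minimal sketch of the kind of mistake a macro allows and a const prevents (the names SIZE, size, and config are made up for illustration):

#include <iostream>

#define SIZE 10              // the macro has no scope and no type

namespace config {
    const int size = 10;     // the const is scoped and typed
}

int main() {
    // int SIZE = 5;         // error: the preprocessor turns this into "int 10 = 5;"
    int size = 5;            // fine: the const lives in its own namespace
    std::cout << size << ' ' << config::size << '\n'; // prints: 5 10
}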
It's always worth reading Stroustrup's advice: "So, what's wrong with using macros?"