If a function, say foo(), has to be called in different ways on various platforms, is it bad practice to use a macro?
For example:
#ifdef WIN32
#define ffoo(a) foo(0)
#else
#define ffoo(a) foo(a)
#endif
In C++ it is considered bad practice: there you have so many other options, such as inheritance, overloading, and templates, that a macro is rarely needed.
Creating function-like macros with #define is error-prone: the arguments are substituted as text and can be evaluated more than once. I would recommend using inline functions or templates instead.
Simple example from the book "Effective C++":
#define CALL_WITH_MAX(a,b) f((a) > (b) ? (a) : (b))
The number of times a is incremented depends on what it is compared against (try it with a = 5, b = 0):
CALL_WITH_MAX(++a, b);      // a is incremented twice
CALL_WITH_MAX(++a, b+10);   // a is incremented once
If you are using C, however, you are more limited, as you don't have templates or object-oriented workarounds.