 

Purging Preprocessing Macros

This is an odd problem, so I have to provide a bit of background. I have a C++ project that I'm working on that I want to clean up a bit. The main issue that I'm dealing with that makes me want to barf is the massive abuse of preprocessor macros that are used in a core component of the project. There's a file with a bunch of #defines that are commented/uncommented before compiling and using the program in order to toggle the use of different algorithms. I'd much rather have command-line arguments to do that than recompile every time we want to try a different algorithm. The problem is that there are so many #ifdefs interwoven throughout the code that it seems impossible to simply refactor the code for each algorithm.

I've been told that the reasoning behind this is that this is supposed to be a real-time system that will be dealing with millisecond units of time, and the code in this component is called so many times that having an if check would adversely affect our performance. If you want to try another algorithm, you have to set the appropriate flags and recompile so that performance is optimized.

So my question for you all is this:

Is there any way that I can get rid of these macros and instead use command-line arguments without a heavy hit to performance and without reworking the logic and the code?

One of the options I was considering was trying to have compiled versions of this component for each of the possible combinations of algorithms, and then pick the version that would be defined by the combination of provided command-line arguments. But according to my friend, the number of combinations is just too many for this to be feasible. I haven't worked out the numbers myself, but I'll take his word for it considering how much work he put into this code.
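
(For a rough sense of scale, assuming the toggles are independent on/off #defines, which the question doesn't spell out: n toggles give 2^n combinations, so 10 toggles already mean 1,024 distinct builds and 20 toggles over a million.)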

MKA asked Dec 06 '25 07:12

2 Answers

this is supposed to be a real-time system that will be dealing with millisecond units of time

That's a real problem.

[...] that having an if check would adversely affect our performance.

That's not a good reason.

If your code had been benchmarked for performance and optimized as a result (as it should have been), that reasoning would apply. I can't imagine any scenario where you would obtain a significant performance gain by replacing ifs with #defines (unless the ifs were comparing string contents, using sequential search, or doing something similarly disastrous for performance).

Because of this I'm willing to bet that the decision to use macros was made at design time, which would probably make it a case of premature optimization ("premature optimization is the root of all macro-definitions" :D)

Is there any way that I can get rid of these macros and instead use command-line arguments without a heavy hit to performance and without reworking the logic and the code?

Yes.

Here are some possible steps (there are other solutions, but this one does not use ifs at all):

  1. Define a benchmark on your code and run it (store the results)

  2. Locate one area of the code that's implemented in terms of more than one possible #define.

  3. Move the defines behind functions with a common interface (a sketch follows this list).

  4. At run-time, compare a parameter to a constant and pass a pointer to the chosen function to the client code.

    Things to avoid:

    • performing the comparison more than once; after the comparison you should have a chosen function pointer; that function pointer should be passed around, not your parameter.
    • performing the comparison using strings (or char* or anything that's not a number). Comparing strings - or any comparison not done in constant time - is disastrous for performance-critical code. Instead of comparing the parameter value using an if, consider doing it with a switch statement.
    • passing large structures as parameters to your strategy functions. Passing should be done by (const) references or by pointers.
  5. Call the strategy code through the function pointer instead of directly.

  6. Repeat the benchmark from step 1 and compare performance.
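
As a rough sketch of steps 3 to 5 (not your code - the algorithm names algo_a/algo_b, the ProcessFn signature, the integer choice parameter and select_algorithm are all made up for illustration; the real component would use its own types and arguments), the idea looks something like this:

    #include <cstdio>
    #include <cstdlib>

    // Common signature shared by every algorithm variant.
    using ProcessFn = double (*)(double sample);

    // Hypothetical stand-ins for what used to sit behind #ifdefs.
    double algo_a(double sample) { return sample * 2.0; }
    double algo_b(double sample) { return sample + 1.0; }

    // Resolve the command-line choice ONCE, outside the hot path,
    // using a numeric switch rather than string comparisons.
    ProcessFn select_algorithm(int choice)
    {
        switch (choice) {
        case 0:  return algo_a;
        case 1:  return algo_b;
        default: return algo_a;
        }
    }

    // The performance-critical loop receives the already-chosen
    // function pointer, so the comparison is never repeated inside it.
    double run(ProcessFn process, const double* samples, int n)
    {
        double acc = 0.0;
        for (int i = 0; i < n; ++i)
            acc += process(samples[i]);
        return acc;
    }

    int main(int argc, char** argv)
    {
        const int choice = (argc > 1) ? std::atoi(argv[1]) : 0;
        const ProcessFn process = select_algorithm(choice);

        const double samples[] = {1.0, 2.0, 3.0, 4.0};
        std::printf("%f\n", run(process, samples, 4));
        return 0;
    }

The switch runs once at start-up; the hot loop only ever sees the function pointer it was handed.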

At this point you should have a strong case to present to your boss/manager:

  • you can make the code run just as fast (adding the cost of a function call to your performance-critical code shouldn't matter much - at the assembly level a function call should involve passing a few pointers on the stack and a jmp instruction - I think). You can show it runs as fast using your benchmark results (a minimal timing sketch follows this list).

  • your code will be easier to maintain (more modular, separating functional blocks behind interfaces, centralizing change and so on)

  • your code should be easier to extend (same reasons as above)

  • you should no longer have to recompile your codebase just to switch algorithms.

  • you got rid of a big problem (caused by premature optimization).

  • you can continue to refactor the code base (and get rid of more macros) as development/maintenance goes on in other areas, with virtually no changes in functional behavior.
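
For steps 1 and 6 (and the first bullet above), a minimal timing harness along these lines can be used. std::chrono is standard, but algo_a and the loop body are placeholders; a real benchmark should exercise the actual component with representative input:

    #include <chrono>
    #include <cstdio>

    using ProcessFn = double (*)(double);

    // Placeholder standing in for the real algorithm.
    double algo_a(double sample) { return sample * 2.0; }

    int main()
    {
        const int iterations = 10000000;
        const ProcessFn process = algo_a;   // chosen once, as in the steps above
        volatile double sink = 0.0;         // volatile keeps the loop from being optimized away

        const auto start = std::chrono::steady_clock::now();
        for (int i = 0; i < iterations; ++i)
            sink = sink + process(static_cast<double>(i));
        const auto stop = std::chrono::steady_clock::now();

        const auto us =
            std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count();
        std::printf("%d calls took %lld us (sink=%f)\n",
                    iterations, static_cast<long long>(us), static_cast<double>(sink));
        return 0;
    }

Run the same harness against the #define build and the function-pointer build and compare the numbers.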

utnapistim answered Dec 07 '25 21:12


Have you profiled the code in question? Assuming an if statement slows down a program sounds like premature optimization, which is a code smell.
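
In practice (assuming a Linux target, which isn't stated here), a quick profile can be collected with perf record ./your_program and inspected with perf report before concluding that the branch is the bottleneck; your_program is of course a placeholder for the real binary.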

Sam Miller answered Dec 07 '25 20:12


