ANSI C #define VS functions

I have a question about the performance of my code. Let's say I have a struct in C for a point:

typedef struct _CPoint
{
    float x, y;
} CPoint;

and a function where I use the struct.

#include <math.h>  /* for sqrt and pow */

float distance(CPoint p1, CPoint p2)
{
    return sqrt(pow((p2.x-p1.x),2)+pow((p2.y-p1.y),2));
}
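For example, calling it on a 3-4-5 triangle prints 5.000000 (the sample coordinates are made up):

#include <stdio.h>

int main(void)
{
    CPoint a = { 0.0f, 0.0f };
    CPoint b = { 3.0f, 4.0f };
    printf("%f\n", distance(a, b)); /* prints 5.000000 */
    return 0;
}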

I was wondering if it would be a smart idea to replace this function with a #define:

#define distance(p1, p2) (sqrt(pow((p2.x-p1.x),2)+pow((p2.y-p1.y),2)))

I think it will be faster because there will be no function-call overhead, and I'm wondering if I should use this approach for all the other functions in my program to improve performance. So my question is:

Should I replace all my functions with #define to increase the performance of my code?

asked Dec 02 '22 by Marnix v. R.
1 Answer

No. You should never choose between a macro and a function based on a perceived performance difference. Evaluate the choice solely on the merits of functions versus macros. In general, choose functions.

Macros have a lot of hidden downsides that can bite you. Case in point: your translation to a macro here is incorrect (or at least not semantics-preserving with respect to the original function). Each argument to the distance macro gets evaluated twice. Imagine I made the following call:

distance(GetPointA(), GetPointB());

In the macro version this actually results in 4 function calls, because each argument is evaluated twice. Had distance been left as a function, it would result in only 3 function calls (distance plus one call for each argument). Note: I'm ignoring sqrt and pow in this count, as they appear in both versions.
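To make the double evaluation concrete, here is a minimal, compilable sketch; GetPointA and GetPointB are hypothetical helpers that print a line each time they run, so the extra calls are visible:

#include <stdio.h>
#include <math.h>

typedef struct _CPoint
{
    float x, y;
} CPoint;

#define distance(p1, p2) (sqrt(pow(((p2).x-(p1).x),2)+pow(((p2).y-(p1).y),2)))

static CPoint GetPointA(void)
{
    puts("GetPointA called"); /* printed twice under the macro */
    return (CPoint){ 0.0f, 0.0f };
}

static CPoint GetPointB(void)
{
    puts("GetPointB called"); /* printed twice under the macro */
    return (CPoint){ 3.0f, 4.0f };
}

int main(void)
{
    /* The macro pastes GetPointA() and GetPointB() into the expansion
       twice each, so four helper calls happen instead of two. */
    printf("%f\n", distance(GetPointA(), GetPointB()));
    return 0;
}

Running this prints each "called" line twice before the result, which is exactly the behavior the function version avoids.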

answered Dec 17 '22 by JaredPar