Detect C/C++ preprocessor abuse that leads to huge expanded code sizes

I am looking for a way to detect or mitigate C++ source that, when preprocessed, expands to such a huge size that GCC runs out of memory.

Sample code:

#include <iostream>
using namespace std;
int main() {
    #define A30 cout << "hello world";
    #define A29 if (1) { A30 } else { A30 }
    #define A28 if (0) { A29 } else { A29 }
    // ... you get the idea ... 
    #define A1 if (1) { A2 } else { A2 }
    #define A0 if (0) { A1 } else { A1 }
    A0
    return 0;
}

Compiling this program should generate a huge, syntactically correct if-else tree (it works with smaller versions, say up to A10); if executed, it trivially prints one of the 2^30 "hello world" strings within that tree. However, attempting compilation on an 8GB machine makes the machine unresponsive and, after a while, displays the following error:

internal compiler error: Segmentation fault
A0
^

Is it possible to limit pre-processor expansion in the above case using GCC 4.9.x, or to otherwise avoid crashing with such programs?

asked Apr 17 '15 by tucuxi

1 Answer

As far as I know, there is no way to do what you are trying to achieve with a simple gcc command. A way around it could be to add an extra step to your build system that checks whether a commit increased the preprocessed code size by more than a certain percentage.

gcc has an option, -E, that outputs the code after the preprocessing step.

If you have a build at commit 1, you can save the number of preprocessed lines at that commit by running gcc with -E and piping its output through "wc -l".

At commit 2, before building, you do the same and check that the line count hasn't grown past a threshold you define (10%? 20%?).
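The check described above can be sketched as a small shell script. The filenames (main.cpp, baseline_lines.txt) and the 20% threshold are hypothetical placeholders; adapt them to your own build system.

```shell
#!/bin/sh
# Demo setup: a trivial source file plus a baseline taken at "commit 1".
cat > main.cpp <<'EOF'
int main() { return 0; }
EOF
g++ -E main.cpp | wc -l > baseline_lines.txt

# At "commit 2", measure the preprocessed size again before building.
baseline=$(cat baseline_lines.txt)
current=$(g++ -E main.cpp | wc -l)

# Fail the build if the preprocessed output grew past the threshold.
limit=$((baseline + baseline * 20 / 100))
if [ "$current" -gt "$limit" ]; then
    echo "preprocessed size grew from $baseline to $current lines; aborting"
    exit 1
fi
echo "preprocessed size OK ($current of at most $limit lines)"
```

Running the check before invoking the real compile means a runaway macro expansion is caught by the cheap -E pass, which only preprocesses and never builds the huge parse tree that crashes the compiler proper.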

answered Nov 08 '22 by dau_sama