
Does the gcc -g debugging flag affect program execution?

I've just been testing a program I'm working on, and I see that it's executing 3μs faster (a statistically significant change) when I compile it with -g. This makes no sense to me - I thought that the -g flag wasn't supposed to affect the program execution, and that even if it did it would make it run slower, not faster.

Can anyone tell me why this is happening? And whether it changes the program's execution flow? I am not compiling with -O because I need it to execute exactly as written, but if -g can somehow make it run faster without changing the instruction order I should obviously be using that.

So I need to know exactly what changes the -g flag makes to the program.

Edit: The more tests I run, the bigger the t-value gets (= the more statistically significant the difference becomes). This is definitely not measurement error - something is going on.

asked Feb 03 '11 by Benubird



2 Answers

As others have said, debugging symbols will not change the control flow of your code unless there is an (unlikely) bug in the compiler.

It changes execution, though, because the executable becomes bigger, and the executed code is spread more widely across more pages. You can expect more cache misses and more I/O. In a multi-tasking environment (and even a Linux/busybox system is such a thing) this can result in slightly different scheduling behavior.
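
You can see the size effect directly. A minimal sketch (prog.c stands in for your own source file; ls and size are the standard tools):

    # build the same source with and without -g
    gcc -o prog prog.c
    gcc -g -o prog_g prog.c

    # the file on disk grows because of the added debug sections...
    ls -l prog prog_g

    # ...while the loadable text/data/bss sections stay identical
    size prog prog_g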

On the other hand, measuring such tiny time differences as you describe is an art in its own right. You are probably in a Heisenberg setting, where your measurements influence execution times. Your measurements may show a statistically significant deviation, but I would be extremely careful about interpreting them as evidence that one option or the other produces faster code.

answered Oct 22 '22 by Jens Gustedt


The -g flag makes no changes to the actual generated code; what it does is add debug sections to the executable. Those sections are not loaded at runtime, but debuggers can load them. Since the executable is now a bit different - it's larger - there will be changes in how it is stored on disk, and you might try measuring the number of page faults with one version versus the other, but there are no code changes.

If you want to see the assembly, run objdump -d on your binaries and compare the output.
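
For example, a quick sketch (assuming two binaries prog and prog_g built from the same source, as in the other answer):

    # disassemble both builds, skipping the header line, which
    # contains the file name and would always differ
    objdump -d prog   | tail -n +3 > prog.asm
    objdump -d prog_g | tail -n +3 > prog_g.asm

    # no output from diff means the machine code is identical
    diff prog.asm prog_g.asm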

I do question the validity of the 3μs difference, though. Reliably measuring 3μs, at least on a general-purpose OS, is a hard task - I hope you have run your program a few thousand times (more likely a few hundred thousand times) to arrive at that number, so as to eliminate all the random factors affecting such a measurement.
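
One way to average that randomness away is to repeat the run many times under a measurement tool. A sketch, assuming a Linux machine with perf installed (./prog is a placeholder for your binary):

    # run the binary 10,000 times; perf reports the mean wall-clock
    # time and its standard deviation across runs
    perf stat -r 10000 ./prog

    # pinning to one CPU core reduces scheduler-induced variance
    taskset -c 0 perf stat -r 10000 ./prog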

answered Oct 22 '22 by nos