I recently split some very large files in my C++ project into many smaller files (basically one file per class). This has more than doubled the compilation time and also enlarged the resulting executable from 1.6 MB to 2.4 MB. Why has this made such a huge difference?
Is this a direct result of having to include a few headers in a lot of files as opposed to just a few?
Compiler options:
g++ -Wall -Wextra -g -ggdb -std=c++0x
The executable size I am referring to is after running strip -s executable.
Sizes:
Before with debug symbols: 16MB
After with debug symbols: 26MB
Before without debug symbols: 1.5MB
After without debug symbols: 2.4MB
Additional question:
I'm already using precompiled headers by putting the headers in a pch.hpp and then using the -include pch.hpp option in my g++ flags. Is this the optimal way to do this with gcc? It seems to have very minimal impact on compile times. The only headers currently not being precompiled are part of the project and subject to change, as the project is under heavy development.
There are several reasons why this could happen; here's a braindump:

- You are now compiling lots of system headers in every compilation unit instead of just a few. There is also a minor overhead associated with linking all the object files together.

Here are some things that can help you out, keeping multiple files but reducing compilation time:

- Use a "unity build": exclude the .cpp files from the build, but #include them in a single implementation file that is compiled.