I found a similar post on this topic, but it addresses the design aspect rather than performance, so I am posting this to understand how breaking up a big C file affects compile and execution time.
I have a big utils file (we all know how quickly they grow). I am trying to understand whether splitting the file into module-based function files (cookies.c, memcacheutils.c, stringutils.c, search.c, sort.c, arrayutils.c, etc.) would add any penalty to compile or execution time.
My common sense says it would add some penalty, as the code now has to resolve function addresses in far-flung places rather than in the same file.
I could be horribly wrong or partially correct. Seeking guidance from all gurus. My current utils file is around 150k with 80+ functions.
Thank you for reading the post.
Generally, splitting your project into multiple compilation units allows for better project management and faster partial compilation. When you edit one file, you only need to recompile that compilation unit and relink in order to test and debug. It has essentially no effect on execution time: once the object files are linked, a call into another translation unit costs the same as a call within the same file.
Depending on your compiler, though, having everything in one file may allow for additional inlining and cross-function optimisation, all at the cost of recompiling the whole file on every change.