When I do a fresh compilation of my project, which includes 10+ open-source libraries, it takes about 40 minutes (on normal hardware).
Question: where is my bottleneck, really — hard-drive seeking or CPU clock speed? I don't think multi-core would help much, correct?
--Edit 1--
my normal hardware = i3 overclocked to 4.0 GHz, 8 GB 1600 MHz DDR3, and a 2 TB Western Digital hard drive
--Edit 2--
my code = 10%, libs = 90%. I know I don't have to build everything every time, but I would like to find out how to improve compile performance, so that when buying a new PC for development I can make a smarter choice.
--Edit 3--
cc = Visual Studio (damn)
A performance bottleneck occurs when the rate at which data is accessed cannot meet the system's requirements. Bottlenecks can be categorized by the class of hardware at which data is accessed within the system.
In software engineering, a bottleneck occurs when the capacity of an application or a computer system is limited by a single component, like the neck of a bottle slowing down the overall water flow.
Bottlenecks are commonly found in three areas: I/O (e.g. database queries), memory usage, and CPU usage.
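A rough way to tell which of those areas a build is stuck on is to time it and compare CPU time (user + sys) to wall-clock time (real): if user + sys is close to real, the build is CPU-bound; if real is much larger, the build is waiting on I/O. A minimal sketch using bash's `time` keyword — the busy loop is just a stand-in for your actual build command (e.g. `make clean && make`) so the snippet is self-contained:

```shell
# Time a command and capture bash's timing report (written to stderr).
# The loop below is a placeholder for the real build command.
{ time sh -c 'i=0; while [ "$i" -lt 100000 ]; do i=$((i+1)); done'; } 2> timing.txt

# Shows real / user / sys; compare user+sys against real to classify the load.
cat timing.txt
```

For a real build, substitute your build command for the loop and look at the ratio over a full clean rebuild.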
You're wrong: multi-core brings a tremendous speed-up, right up until the moment your hard drive gives up, actually :)
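Since the questioner's compiler is Visual Studio, here is a sketch of how to turn that multi-core speed-up on there. Both flags are real MSVC/MSBuild options; the solution name is made up for illustration:

```shell
# /MP lets cl.exe compile multiple source files of one project in parallel.
cl /MP /c a.cpp b.cpp c.cpp

# /m lets MSBuild build independent projects of a solution in parallel
# (here capped at 4 worker processes). "MySolution.sln" is hypothetical.
msbuild MySolution.sln /m:4
```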
Proof by example: distcc, which provides distributed builds. (My builds use about 20 cores in parallel; they're actually bound by the local preprocessing phase.)
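For reference, a minimal distcc setup looks like the sketch below (the host names are made up). distcc farms the compile step out to remote machines, but preprocessing still runs locally — which is why the build above ends up bound by the local preprocessing phase:

```shell
# Machines that run distccd and will receive compile jobs (hypothetical names).
export DISTCC_HOSTS="localhost build-box-1 build-box-2"

# Run many compile jobs in parallel, routing each through distcc.
make -j20 CC="distcc gcc" CXX="distcc g++"
```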
As for the real bottleneck, it has a lot to do with the #include
mechanism. Languages with proper modules compile much faster...