
Compile programs on multicore or distributed system

Tags:

linux

makefile

Is there any software available on Linux that can compile source code containing a large number of files in parallel, on either a multicore or a distributed system? Libraries like gcc or xserver take a very long time to compile on a single-core or dual-core machine, and it is frustrating when you need a lot of recompilation. Is there any technique for compiling such source code in parallel?

asked Dec 27 '09 by atv


People also ask

Does compiling use multiple cores?

With 1000 files, each processor core can compile one file at a time, keeping every core busy. Tip: make uses multiple cores if you give it the right command-line option; without it, make compiles one file after another, even on a 16-core system.

How does distcc work?

distcc is designed to speed up compilation by taking advantage of unused processing power on other computers. A machine with distcc installed can send code to be compiled across the network to a computer which has the distccd daemon and a compatible compiler installed. distcc works as an agent for the compiler.


1 Answer

On distributed-memory systems, you can use distcc to farm out compile jobs to other machines. This takes a little bit of setup, but it can really speed up your build if you happen to have some extra machines around.
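As a minimal sketch of that setup, assuming two helper machines with the hypothetical hostnames buildbox1 and buildbox2 are already running the distccd daemon with a matching gcc installed:

```shell
# Tell distcc which hosts may take jobs; "localhost" keeps some
# of the work on this machine too (hostnames are hypothetical).
export DISTCC_HOSTS="localhost buildbox1 buildbox2"

# Build with distcc wrapping the compiler. It is common to pass
# more jobs than you have local cores, since the remote machines
# absorb the extra compile work.
make -j12 CC="distcc gcc" CXX="distcc g++"
```

Only compilation is distributed; preprocessing and linking still happen on the local machine, so the local host stays on the list.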

On shared-memory multicore systems, you can just use make -j, which will try to spawn build jobs in parallel based on the dependencies in your makefiles. You can run it like this:

$ make -j

which will impose no limit on the number of jobs spawned, or you can run with an integer parameter:

$ make -j8

which will limit the number of concurrent build jobs. Here, the limit is 8 concurrent jobs. Usually you want this to be something close to the number of cores on your system.
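To make that concrete, here is a minimal sketch: a throwaway Makefile with two independent targets (file names are purely illustrative), built with one job per core as reported by nproc from GNU coreutils:

```shell
# Write a tiny Makefile with two targets that do not depend on
# each other, so make is free to build them concurrently.
# (Recipe lines must begin with a tab.)
cat > Makefile <<'EOF'
all: a.stamp b.stamp
a.stamp:
	touch a.stamp
b.stamp:
	touch b.stamp
EOF

# nproc prints the number of available cores; use one job per core.
make -j"$(nproc)"
```

Because a.stamp and b.stamp have no dependency between them, make -j can run both touch commands at the same time; with real compile jobs, the speedup scales with how independent your object files are.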

answered Oct 07 '22 by Todd Gamblin