Binary GCD Algorithm vs. Euclid's Algorithm on modern computers

http://en.wikipedia.org/wiki/Binary_GCD_algorithm

This Wikipedia entry has a very dissatisfying implication: the binary GCD algorithm was at one time as much as 60% more efficient than the standard Euclidean algorithm, but as late as 1998 Knuth concluded that it gave only a 15% gain in efficiency on his contemporary computers.

Well, another 15 years have passed... how do these two algorithms stack up today, given advances in hardware?

Does the binary GCD continue to outperform the Euclidean algorithm in low-level languages, only to lag behind due to its complexity in higher-level languages like Java? Or is the difference moot on modern hardware?
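
For concreteness, here is how I understand the two contenders, as minimal C sketches (not canonical implementations): Euclid's inner loop pays for a division (modulo) on every step, while the binary version gets by with shifts, parity tests and subtractions.

    #include <stdint.h>

    /* Classic Euclidean algorithm: one division (modulo) per iteration. */
    uint64_t gcd_euclid(uint64_t a, uint64_t b)
    {
        while (b != 0) {
            uint64_t t = a % b;
            a = b;
            b = t;
        }
        return a;
    }

    /* Binary GCD (Stein's algorithm): no division at all. */
    uint64_t gcd_binary(uint64_t a, uint64_t b)
    {
        if (a == 0) return b;
        if (b == 0) return a;

        int shift = 0;                    /* common factors of 2 */
        while (((a | b) & 1) == 0) {
            a >>= 1;
            b >>= 1;
            shift++;
        }
        while ((a & 1) == 0)              /* make a odd */
            a >>= 1;

        do {
            while ((b & 1) == 0)          /* make b odd */
                b >>= 1;
            if (a > b) {                  /* keep a <= b */
                uint64_t t = a;
                a = b;
                b = t;
            }
            b -= a;                       /* result is even (or zero) */
        } while (b != 0);

        return a << shift;
    }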

Why do I care you might ask? I just so happen to have to process like 100 billion of these today :) Here's a toast to living in an era of computing (poor Euclid).

asked Nov 19 '11 by Jonathan Schneider


1 Answer

The answer is of course "it depends". It depends on hardware, compiler, specific implementation, whatever I forgot. On machines with slow division, binary GCD tends to outperform the Euclidean algorithm. I benchmarked it a couple of years ago on a Pentium 4 in C, Java and a few other languages; overall in that benchmark, binary GCD with a 256-element lookup table beat the Euclidean algorithm by a factor of between 1.6 and nearly 3. The Euclidean algorithm came closer when, instead of dividing immediately, a few rounds of subtraction were performed first. I don't remember the figures, but binary was still considerably faster.
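
The natural use for a 256-element table here is a per-byte trailing-zero count, which lets the shift loops strip up to eight zero bits per lookup instead of one per iteration. A sketch along those lines (my reconstruction; the details are an assumption, not the exact code that was benchmarked):

    #include <stdint.h>

    /* tz8[i] = number of trailing zero bits in the byte value i.
     * tz8[0] is set to 8; it is never consulted for nonzero inputs. */
    static uint8_t tz8[256];

    static void init_tz8(void)                /* call once at startup */
    {
        tz8[0] = 8;
        for (int i = 1; i < 256; i++) {
            int v = i, n = 0;
            while ((v & 1) == 0) {
                v >>= 1;
                n++;
            }
            tz8[i] = (uint8_t)n;
        }
    }

    /* Trailing-zero count of a nonzero x, one byte lookup at a time. */
    static int trailing_zeros(uint64_t x)
    {
        int n = 0;
        while ((x & 0xff) == 0) {
            x >>= 8;
            n += 8;
        }
        return n + tz8[x & 0xff];
    }

    /* Binary GCD with table-driven shifting. */
    uint64_t gcd_binary_lut(uint64_t a, uint64_t b)
    {
        if (a == 0) return b;
        if (b == 0) return a;

        int za = trailing_zeros(a);
        int zb = trailing_zeros(b);
        int shift = za < zb ? za : zb;        /* common power of two */
        a >>= za;                             /* a and b are now odd */
        b >>= zb;

        while (a != b) {
            if (a > b) {
                a -= b;                       /* odd - odd is even and */
                a >>= trailing_zeros(a);      /* nonzero, so strip it  */
            } else {
                b -= a;
                b >>= trailing_zeros(b);
            }
        }
        return a << shift;
    }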

If the machine has fast division, things may be different, since the Euclidean algorithm needs fewer operations. If the difference in cost between division and subtraction/shifts is small enough, binary will be slower. Which one is better in your circumstances, you will have to find out by benchmarking yourself.
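
If it helps as a starting point for such a benchmark, the subtract-first hybrid mentioned above looks roughly like this; the cutoff of four subtraction rounds is an arbitrary guess to tune, not a recommendation:

    #include <stdint.h>

    /* Euclidean GCD that tries a few cheap subtractions before
     * paying for a division. ROUNDS is a tuning knob. */
    #define ROUNDS 4

    uint64_t gcd_hybrid(uint64_t a, uint64_t b)
    {
        while (b != 0) {
            int r = ROUNDS;
            while (a >= b && r-- > 0)         /* small quotient: subtract */
                a -= b;
            if (a >= b)                       /* large quotient: divide   */
                a %= b;
            uint64_t t = a;                   /* now a < b; swap them     */
            a = b;
            b = t;
        }
        return a;
    }

Timing all three variants on a representative sample of your actual inputs will tell you more than any general rule.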

answered Sep 30 '22 by Daniel Fischer