As a class assignment, I have to write a C program that generates all Pythagorean triples below a given value t. Here's my code: it first generates a primitive triple (a, b, c) using Euclid's formula, then prints all multiples (ka, kb, kc) with kc < t.
for (i = 2; i < (sqrt(t) + 1); i++)
    for (j = 1; j < i; j++)
        if ((gcd(i,j) == 1) && ((i-j) % 2) && ((i*i + j*j) < t))
        {
            k = 0;
            a = i * i - j * j;
            b = 2 * i * j;
            c = i * i + j * j;
            while ((++k) * c < t)
                printf("(%d, %d, %d)\n", k*a, k*b, k*c);
        }
Most other algorithms I came across use nested loops to check sums of squares, and they become significantly slower than this as t grows. Is it possible to prove that mine is indeed faster?
Algorithmic complexity is the standard way to analyze performance. In particular, big O notation is commonly used to compare algorithms by their worst-case behaviour.
In your case you have 4 loops: the for over i, the for over j, the loop inside gcd, and the while that prints the multiples.

The for over i performs O(sqrt(t)) iterations and, for each i, the for over j performs at most i <= sqrt(t) iterations, so together they examine O(t) pairs (i, j). Euclid's gcd on arguments below sqrt(t) takes O(log t) steps. The while loop needs more care: it is not bounded by sqrt(t) per invocation (for the primitive (3, 4, 5) it runs about t/5 times), but across the whole run it executes exactly once per triple printed, and the total number of triples printed is at most

sum over pairs (i, j) of t/(i*i + j*j) <= t * sum over i <= sqrt(t) of 1/i = O(t log t)

The overall complexity is therefore:

O(for_i) * O(for_j) * O(gcd) + O(while, summed over all pairs)
=
O(sqrt(t)) * O(sqrt(t)) * O(log t) + O(t log t)
=
O(t log t)

i.e. the running time is essentially proportional to the number of triples printed.
For the other algorithms that are slower than your method, you can apply the same reasoning: derive their big O and show that it is larger than yours. For example, the naive algorithm that checks all sums of squares uses two nested loops of at most t iterations each, so its complexity is O(t*t), which grows much faster than the bound derived above.