I've often read that the discrete logarithm is a hard problem. However, I don't quite see how that can be, because it seems to me that an ordinary binary search over the exponent would solve it. For example,
binary_search(base, left, right, target) {
    if (left > right)
        return -1;  // target is not an exact power of base
    mid = (left + right) / 2;
    if (pow(base, mid) == target)
        return mid;
    if (pow(base, mid) < target)
        return binary_search(base, mid + 1, right, target);
    else
        return binary_search(base, left, mid - 1, target);
}

log(base, number) {
    // Exponential search: double 'right' until pow(base, right) >= number,
    // giving a bracket [left, right] that must contain the exponent.
    left = 1;
    right = 2;
    while (pow(base, right) < number) {
        left = right;
        right *= 2;
    }
    return binary_search(base, left, right, number);
}
If the naive implementation of just incrementing p until pow(base, p) reaches the target is O(n), then surely this binary search is O(log(n)^2). Or do I not understand how this algorithm is measured?
Edit: I don't usually write binary searches, so if there's some trivial implementation error, kindly just ignore it or edit in a fix.
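For what it's worth, here is a runnable Python sketch of the same exponential-bracket-plus-binary-search idea (the name int_log is mine, not standard), which does work over the ordinary integers:

```python
def int_log(base, number):
    # Exponential search: find a bracket [left, right] with
    # base**left <= number <= base**right.
    left, right = 1, 2
    while base ** right < number:
        left = right
        right *= 2
    # Binary search inside the bracket for p with base**p == number.
    while left <= right:
        mid = (left + right) // 2
        value = base ** mid
        if value == number:
            return mid
        elif value < number:
            left = mid + 1
        else:
            right = mid - 1
    return None  # number is not an exact power of base

print(int_log(3, 3 ** 50))  # 50
```

This relies entirely on exponentiation being monotonic in the exponent, which is the assumption the answer below the question addresses.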
Your algorithm assumes that a < b implies pow(base, a) < pow(base, b).
This is true over the natural numbers, but the discrete-logarithm problem is posed in a finite cyclic group, where pow is computed modulo some number and that monotonicity fails.