Is the time complexity of this program O(1)?
void f1(int n) {
    int temp = n;
    int len = 0, result = 0, m = 1;   /* placeholders so the snippet compiles */
    while (temp /= 2)
    {
        len++; result += m;
    }
}
And if we change int temp to double temp, does the time complexity change as well, or will it remain constant?
void f2(int n) {
    double temp = (double)n;
    int len = 0, result = 0, m = 1;   /* same placeholders as above */
    while (temp /= 2)
    {
        len++; result += m;
    }
}
For the integer version the answer is O(log n), because the value is halved on every iteration: starting from n, the loop runs floor(log2(n)) times (for n >= 1) before temp reaches 0.
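To check the integer case concretely, here is a small sketch of mine (not from the question; link with -lm for log2): it counts the iterations of f1's loop and compares them with floor(log2(n)).
#include <stdio.h>
#include <math.h>
/* Count how many times the integer loop from f1 runs. */
static int int_halvings(int n) {
    int temp = n, len = 0;
    while (temp /= 2)
        len++;
    return len;
}
int main(void) {
    int samples[] = {1, 2, 100, 10000, 1000000, 100000000};
    for (int i = 0; i < 6; i++) {
        int n = samples[i];
        /* For n >= 1 the count equals floor(log2(n)). */
        printf("%9d -> %2d iterations, floor(log2(n)) = %2d\n",
               n, int_halvings(n), (int)floor(log2((double)n)));
    }
    return 0;
}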
The double version starts the same way, except that when the value gets down to around 1 it does not stop: it keeps dividing until underflow finally makes it 0, and the number of those extra divisions is fixed by the floating-point format rather than by n.
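To see where that tail ends, here is a tiny check of mine (assuming IEEE-754 doubles and the C11 macro DBL_TRUE_MIN from float.h): the smallest positive subnormal double is 2^-1074, one more halving rounds it to 0, and going from 1.0 down to 0 therefore costs a fixed number of divisions (1074 on my machine).
#include <stdio.h>
#include <float.h>
int main(void) {
    /* Smallest positive (subnormal) double under IEEE-754: 2^-1074. */
    double tiny = DBL_TRUE_MIN;           /* C11 macro, about 4.94e-324 */
    printf("smallest subnormal: %g\n", tiny);
    printf("halved once more:   %g\n", tiny / 2);   /* rounds to 0 */
    /* Count the halvings needed to take 1.0 all the way down to 0. */
    double temp = 1.0;
    int len = 0;
    while (temp /= 2)
        len++;
    printf("halvings from 1.0 to 0: %d\n", len);    /* prints 1074 here */
    return 0;
}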
I've written a small, empirically calibrated program that tries to predict the number of iterations:
#include <stdio.h>
#include <math.h>
void f2(int n){
int len=0;
double temp = (double)n;
while (temp /= 2)
{
len++;
}
// 1.53 is an empirical constant; maybe it could be computed theoretically,
// it was just chosen to make the numbers match
printf("%d %lf\n",len,log(n)*1.53+1074);
}
int main()
{
f2(100000000);
f2(10000000);
f2(1000000);
f2(10000);
f2(100);
f2(1);
}
I get:
1101 1102.183642
1097 1098.660686
1094 1095.137731
1087 1088.091821
1081 1081.045910
1074 1074.000000
So the complexity is O(log n), plus an incompressible number of iterations that depends on the floating-point format rather than on n: with IEEE-754 doubles the loop only stops once the value drops below the smallest subnormal, 2^-1074, which adds about 1074 halvings regardless of n.
(My apologies for the empirical aspect of my answer, I'm not a floating-point expert.)
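Following up on the "maybe it could be theoretically computed" remark: if I'm not mistaken, the fitted 1.53 can be replaced by the exact slope 1/ln(2) ≈ 1.4427 (converting the natural log to a base-2 log) and the intercept by the ~1074-step subnormal tail described above, giving roughly floor(log2(n)) + 1074 iterations, give or take one because of rounding in the subnormal range. A sketch of mine (compile with -lm), which matches the measured counts above to within one iteration:
#include <stdio.h>
#include <math.h>
/* Same loop as f2, returning the measured iteration count. */
static int measured(int n) {
    double temp = (double)n;
    int len = 0;
    while (temp /= 2)
        len++;
    return len;
}
int main(void) {
    int samples[] = {100000000, 10000000, 1000000, 10000, 100, 1};
    for (int i = 0; i < 6; i++) {
        int n = samples[i];
        /* Predicted: about log2(n) halvings to get near 1.0, then ~1074
           more to underflow to 0; rounding can shift the count by one. */
        int predicted = (int)floor(log2((double)n)) + 1074;
        printf("%9d: measured %d, predicted ~%d\n", n, measured(n), predicted);
    }
    return 0;
}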