
What are the differences between O(1) and O(2) in algorithm-analysis?

According to the definition of big O, f(n) = O(g(n)) means there is a constant C such that f(n) <= C * g(n). Plugging in g(n) = 1 and g(n) = 2, it could be deduced that:

f(n) <= C
f(n) <= 2C

I think there is no big difference between the two. The best example I could come up with is:

f(n) = 1 - 1 / n
f(n) = 2 - 1 / n
C = 1

But what distinguishes these two complexities, since both are constant?

Could you show some real-world code to demonstrate the difference between O(1) and O(2)?

asked Jan 04 '10 by Jichao

People also ask

What is an O(1) algorithm?

O(1) describes algorithms that take the same amount of time to compute regardless of the input size. For instance, if a function takes the same time to process ten elements as it does to process one million elements, we say that it has a constant growth rate, or O(1).
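For example (a minimal Python sketch; the dictionaries and their sizes are just illustrative), a hash lookup costs the same whether the table holds ten entries or a million:

small = {i: i * i for i in range(10)}
large = {i: i * i for i in range(1_000_000)}

# A hash lookup does the same fixed amount of work in either dict:
# constant time, O(1), independent of how many entries are stored.
print(small[3], large[3])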

What does o(1) mean in statistics?

The notation o(1) means "a function that converges to 0." In other words, there is some input size past which the function is always between -0.1 and 0.1; there is some input size past which the function is always between -0.01 and 0.01; and so on.
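A concrete illustration, borrowing 2 - 1/n from the question above and adding 1/n for contrast (a restatement of the definition, nothing more):

\lim_{n \to \infty} \frac{1}{n} = 0 \;\Rightarrow\; \frac{1}{n} \in o(1)

\lim_{n \to \infty} \left( 2 - \frac{1}{n} \right) = 2 \;\Rightarrow\; 2 - \frac{1}{n} \in O(1) \text{ but } 2 - \frac{1}{n} \notin o(1)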

What does O(1) complexity mean?

An algorithm has constant time complexity if it takes the same time regardless of the number of inputs. If an algorithm's time complexity is constant, it means that it will always run in the same amount of time, no matter the input size.

What is the difference between O(1) and O(n)?

In short, O(1) means that an operation takes constant time, like 14 nanoseconds or three minutes, no matter the amount of data in the set. O(n) means the time grows linearly with the size of the set, so a set twice the size will take twice the time.
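A sketch of that contrast in Python (the sizes and repeat counts are arbitrary): membership tests on a set are O(1) on average, while membership tests on a list are O(n) because they scan element by element.

import timeit

small_list, large_list = list(range(1_000)), list(range(100_000))
small_set, large_set = set(small_list), set(large_list)

# O(n): searching a list for a missing value touches every element,
# so the 100x larger list takes roughly 100x longer.
print(timeit.timeit(lambda: -1 in small_list, number=1_000))
print(timeit.timeit(lambda: -1 in large_list, number=1_000))

# O(1): a hash lookup in a set costs about the same at either size.
print(timeit.timeit(lambda: -1 in small_set, number=1_000))
print(timeit.timeit(lambda: -1 in large_set, number=1_000))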


5 Answers

There is no difference between O(1) and O(2). Algorithms that are O(1) are also O(2), and vice versa. In fact, O(c1) equals O(c2) for any positive constants c1 and c2.

O(c), where c is a positive constant, simply means that the runtime is bounded by a constant, independent of the input or problem size. From this it is clear (informally) that O(1) and O(2) are equal.

Formally, consider a function f in O(1). Then there is a constant c such that f(n) <= c * 1 for all n. Let d = c / 2. Then f(n) <= c = (c / 2) * 2 = d * 2, which shows that f is O(2). Similarly, if g is O(2), there is a constant c such that g(n) <= c * 2 for all n. Let d = 2 * c. Then g(n) <= c * 2 = d = d * 1, which shows that g is O(1). Therefore O(1) = O(2).
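To connect this to the real-world code the question asks for (a minimal sketch; the function names are invented for illustration): a routine that performs two fixed steps instead of one could be called "O(2)", yet it sits in exactly the same class as O(1), because neither step depends on the input size.

def get_first(items):
    # One constant-time step: "O(1)".
    return items[0]

def get_first_and_last(items):
    # Two constant-time steps: "O(2)" -- the same class as O(1),
    # since neither step depends on len(items).
    return items[0], items[-1]

# Both take the same amount of time for a 10-element list as for a
# 10-million-element list; the only difference is a constant factor.
print(get_first([1, 2, 3]))
print(get_first_and_last([1, 2, 3]))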

answered Nov 11 '22 by jason


O(1) and O(2) are the same, as is any O(constant value).

The point is that neither depends on N, the input size.

answered Nov 11 '22 by Mitch Wheat


There is no difference.

Picture a graph where a red line represents O(n) and a green curve represents O(n²).

As n increases, the constants 1 and 2 become insignificant, and the green curve pulls away at a much faster rate. This is what Big-O notation is trying to capture: constants are relatively meaningless compared to how a function grows.
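A rough numerical version of that picture, with the plotting left out (just printed values):

# Compare the fixed bounds 1 and 2 with genuinely growing functions n and n^2:
# the gap between 1 and 2 is meaningless next to how n and n^2 race away.
print(f"{'n':>8} {'O(1)':>6} {'O(2)':>6} {'n':>8} {'n^2':>12}")
for n in (1, 10, 100, 1_000, 10_000):
    print(f"{n:>8} {1:>6} {2:>6} {n:>8} {n * n:>12}")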

answered Nov 11 '22 by mpen


Perhaps whoever wrote O(2) meant that both algorithms execute in constant time regardless of input size (usually denoted N), but that one of them is twice as fast. Strictly, though, that is an abuse of big-O notation.
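A sketch of that reading (the workloads below are simulated with fixed-size loops, purely for illustration): both functions run in constant time with respect to the input, but one does roughly twice the work.

import timeit

def work_once(data):
    # A fixed 1,000 steps of work, regardless of len(data): constant time.
    return sum(range(1_000))

def work_twice(data):
    # A fixed 2,000 steps of work, still independent of len(data):
    # also constant time, just about twice as slow.
    return sum(range(2_000))

for size in (10, 100_000):
    data = list(range(size))
    t1 = timeit.timeit(lambda: work_once(data), number=2_000)
    t2 = timeit.timeit(lambda: work_twice(data), number=2_000)
    print(f"size={size:>7}  work_once={t1:.4f}s  work_twice={t2:.4f}s")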

answered Nov 11 '22 by Seva Alekseyev


There is no difference between O(1) and O(2).

Big-O notation is only defined up to a constant factor. O(f(x)) means that there is some constant k such that the running time is at most k * f(x).

If something is O(2), then there is some constant k such that the running time is less than 2k. Since 2k is itself a constant, the constant k' = 2k works for O(1).

answered Nov 11 '22 by Chip Uni