Is it possible for the time complexity of an algorithm to decrease as the input size increases? Any examples?

I just read in Cormen's algorithms book that big-O and big-Omega do not follow the trichotomy property. That is, for two functions f(n) and g(n), it may be that neither f(n) = O(g(n)) nor f(n) = Omega(g(n)) holds. As an example, they argue that the function n^(1 + sin n) makes this possible.

While that is correct, is it possible for a real-world algorithm to have a running time of something like sin n, so that the running time sometimes decreases as the input size increases? Does anybody know of such an algorithm, or can you give a small code snippet that does this?
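
To make what I mean concrete, here is a purely artificial sketch whose step count is proportional to n^(1 + sin n). It is obviously not a real-world algorithm, and the function name and test values are just illustrative:

```python
import math

def oscillating_work(n):
    # Do roughly n**(1 + sin n) units of dummy work.
    # Since sin n swings between -1 and 1, the step count swings
    # between about n**0 and n**2, so it is not monotone in n.
    steps = int(n ** (1 + math.sin(n)))
    total = 0
    for _ in range(steps):
        total += 1
    return total

for n in (10, 40, 70, 100):
    print(n, oscillating_work(n))
```

Running this, the amount of work rises and then falls again as n grows, which is exactly the non-monotone behaviour I am asking about for real algorithms.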

Thanks for the answers. In that case, is it correct to assume the following: given a problem P of size n, if it cannot be solved in O(f(n)) time by any known algorithm, then the lower bound of P is Omega(f(n))?

asked Sep 17 '11 by ocwirk


2 Answers

The average running time of SAT for randomly-generated 3CNF clauses eventually goes down as the ratio of clauses to variables increases. The intuition is that when there are very many clauses relative to the number of variables, it is more likely that the formula is "obviously" unsatisfiable; that is, typical SAT-solving algorithms (a step or two better than exhaustive search, but simple enough to cover in an undergrad logic course) quickly reach contradictions and stop.

Of course, those are experimental observations for some notion of "random" 3CNF formulas. I'm not sure what people have proven about it.
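
Here is a rough, self-contained sketch of how one might observe this trend experimentally. The random 3CNF generator, the naive DPLL solver, and the clause/variable ratios below are all illustrative choices of mine, not any particular published experiment:

```python
import random
import time

def random_3cnf(num_vars, num_clauses, rng):
    """Random 3CNF: each clause picks 3 distinct variables with random signs."""
    clauses = []
    for _ in range(num_clauses):
        chosen = rng.sample(range(1, num_vars + 1), 3)
        clauses.append([v if rng.random() < 0.5 else -v for v in chosen])
    return clauses

def dpll(clauses):
    """Very plain DPLL: unit propagation plus naive branching.
    Returns True iff the clause set is satisfiable."""
    clauses = [list(c) for c in clauses]
    # Unit propagation: repeatedly simplify with forced literals.
    while True:
        unit = next((c[0] for c in clauses if len(c) == 1), None)
        if unit is None:
            break
        simplified = []
        for c in clauses:
            if unit in c:
                continue                      # clause already satisfied
            if -unit in c:
                c = [lit for lit in c if lit != -unit]
                if not c:
                    return False              # empty clause: contradiction
            simplified.append(c)
        clauses = simplified
    if not clauses:
        return True
    # Branch on the first literal of the first remaining clause.
    lit = clauses[0][0]
    return dpll(clauses + [[lit]]) or dpll(clauses + [[-lit]])

if __name__ == "__main__":
    rng = random.Random(0)
    n_vars = 40
    for ratio in (2, 4, 6, 8, 10):
        start = time.perf_counter()
        for _ in range(5):
            dpll(random_3cnf(n_vars, ratio * n_vars, rng))
        print(f"clauses/variables = {ratio:2d}: {time.perf_counter() - start:.3f}s")
```

The hard instances tend to cluster around the satisfiability threshold (a ratio of roughly 4.3 for random 3CNF); well above it, the solver usually derives a contradiction quickly, so the measured time can go down even though the formulas are larger.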

answered Jan 14 '23 by Ryan Culpepper


The Boyer-Moore string search algorithm gets faster as the pattern (the string being searched for) gets longer, because a longer pattern allows larger skips through the text on a mismatch. Of course, the limiting factor is usually the length of the text being searched in.
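
A minimal sketch of the idea, using Boyer-Moore-Horspool (the simplified variant that keeps only the bad-character shift table); the text, patterns, and timing loop are made up purely for illustration:

```python
import random
import time

def horspool_search(text, pattern):
    """Boyer-Moore-Horspool search.
    Returns the index of the first match of pattern in text, or -1."""
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    if m > n:
        return -1
    # Bad-character shift table: for the text character aligned with the
    # end of the current window, how far the window may slide forward.
    shift = {}
    for i in range(m - 1):
        shift[pattern[i]] = m - 1 - i
    i = 0
    while i <= n - m:
        j = m - 1
        while j >= 0 and text[i + j] == pattern[j]:
            j -= 1
        if j < 0:
            return i
        # Characters not in the pattern allow a full shift of m.
        i += shift.get(text[i + m - 1], m)
    return -1

if __name__ == "__main__":
    random.seed(1)
    text = "".join(random.choice("abcd") for _ in range(500_000))
    # A pattern made of characters absent from the text forces a scan of
    # the whole text, so the effect of pattern length is easy to see.
    for length in (4, 16, 64):
        pattern = "z" * length
        start = time.perf_counter()
        horspool_search(text, pattern)
        print(f"pattern length {length:3d}: {time.perf_counter() - start:.5f}s")
```

With a longer pattern, each mismatch lets the window skip further, so the search over the same text examines far fewer positions and finishes faster.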

answered Jan 14 '23 by Svante