I have a quite general question. Have you ever had to actually compute (e.g., on paper) the complexity of an algorithm outside of school, as a working programmer? And if so, can you give me an example, please?
thank you :)
If your algorithm runs in time proportional to the logarithm of the input size, that is, log(n), then you have O(log(n)) complexity. This type of complexity usually appears in algorithms that repeatedly divide the input.
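Binary search is the classic example of this: it halves the remaining search range on every step. A minimal sketch (the function and variable names here are my own, not from the post):

```python
# Illustrative sketch: binary search discards half of the remaining
# range each iteration, so it takes O(log(n)) comparisons.
def binary_search(sorted_items, target):
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid          # found: return the index
        elif sorted_items[mid] < target:
            lo = mid + 1        # discard the lower half
        else:
            hi = mid - 1        # discard the upper half
    return -1                   # target not present

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # -> 3
```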
O(n²) describes an algorithm whose running time is directly proportional to the square of the input size. Each additional nested iteration over the input raises the exponent: three nested loops give O(n³), four give O(n⁴), and so on.
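A typical O(n²) case is comparing every pair of elements with two nested loops over the same input; adding a third loop (e.g., checking all triples) would push it to O(n³). A small sketch under that assumption, with names of my own choosing:

```python
# Illustrative sketch: two nested loops over the same input compare
# every pair of elements, giving O(n^2) comparisons in the worst case.
def has_duplicate(items):
    n = len(items)
    for i in range(n):              # outer loop: n iterations
        for j in range(i + 1, n):   # inner loop: up to n iterations each
            if items[i] == items[j]:
                return True
    return False

print(has_duplicate([4, 1, 7, 1]))  # -> True
```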
Algorithmic complexity is concerned with how fast or slow a particular algorithm performs. We define complexity as a numerical function T(n): time as a function of the input size n. We want to describe the time taken by an algorithm without depending on implementation details.
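One way to make T(n) concrete is to count basic operations rather than measure wall-clock time, since the count does not depend on the machine or language. A small sketch of that idea (the counting scheme is my own illustration, not from the post):

```python
# Illustrative sketch: count the comparisons performed by a linear search.
# In the worst case the count equals n, so T(n) = n no matter how fast
# the machine running it happens to be.
def count_comparisons(items, target):
    comparisons = 0
    for item in items:
        comparisons += 1
        if item == target:
            break
    return comparisons

print(count_comparisons(list(range(100)), 99))  # -> 100 (worst case: T(n) = n)
```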
Linear time is the best possible time complexity in situations where the algorithm has to sequentially read its entire input.
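Summing a list is a simple case of this: the algorithm cannot know the total without reading every element once, so O(n) is the best it can do. A minimal sketch (my own illustration):

```python
# Illustrative sketch: computing a sum must visit every element exactly once,
# so its running time grows linearly with the input size, i.e. O(n).
def total(values):
    acc = 0
    for v in values:   # one pass over the entire input
        acc += v
    return acc

print(total([2, 4, 6, 8]))  # -> 20
```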
If you're writing a piece of software and you can think of multiple ways to implement it, often one of the deciding factors (in addition to conceptual complexity and time to implement) is going to be the algorithmic complexity. Thus, when your boss wants justification for your decision, figuring out the complexity of each option is necessary. While some may consider this a form of premature optimization, I think the consensus is that choosing a design appropriate for your problem is just good software engineering.