
What can be parameters other than time and space while analyzing certain algorithms?

I'm interested in parameters other than space and time that can be considered when analysing the effectiveness of an algorithm. For example, when developing encryption algorithms we can focus on the effectiveness of the trapdoor function. What other things can you think of?

sul4bh asked Mar 08 '10

People also ask

What are the different parameters to analyze an algorithm?

Speed is one of the key parameters in determining the potential of an algorithm. Other factors, such as user-friendliness, security, maintainability, and memory usage, also determine the quality of an algorithm. Time and space complexity are the metrics used to measure these parameters.

What are the parameters and their rules considered for analysis of an algorithm?

1.3 Analysis of Algorithms. A complete analysis of the running time of an algorithm involves the following steps: Implement the algorithm completely. Determine the time required for each basic operation. Identify unknown quantities that can be used to describe the frequency of execution of the basic operations.

What are the parameters to judge the efficiency of an algorithm?

The two main measures for the efficiency of an algorithm are time complexity and space complexity, but they cannot be compared directly. So, both time and space complexity are considered when judging algorithmic efficiency. An algorithm must be analyzed to determine its resource usage.

What are the criteria for analyzing algorithms explain space complexity in detail?

Space factor − space is measured by counting the maximum memory the algorithm requires. The complexity of an algorithm f(N) gives the running time and/or storage space needed by the algorithm with respect to N, the size of the input data.


4 Answers

First and foremost there's correctness. Make sure your algorithm always works, no matter what the input. Even for input that the algorithm is not designed to handle, you should print an error message, not crash the entire application. If you use greedy algorithms, make sure they truly work in every case, not just the few cases you tried by hand.
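To make that concrete, here is a minimal Python sketch (the function name and the specific checks are mine, not from the answer): reject input the algorithm can't handle with a clear error rather than letting it crash deeper in the code.

```python
def mean(values):
    """Arithmetic mean; fails with a clear error on input it can't handle."""
    if not values:
        raise ValueError("mean() requires a non-empty sequence")
    if not all(isinstance(v, (int, float)) for v in values):
        raise TypeError("mean() requires numeric values")
    return sum(values) / len(values)

try:
    print(mean([]))
except ValueError as e:
    print("error:", e)  # report the problem instead of crashing the application
```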

Then there's practical efficiency. An O(N²) algorithm can be a lot faster than an O(N) algorithm in practice. Do actual tests and don't rely on theoretical results too much.
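The "do actual tests" advice is easy to follow: time both implementations on data like yours. A hypothetical Python benchmark (both sorts written out here purely for illustration) often shows the O(N²) insertion sort beating the O(N log N) merge sort on small inputs, because of its tiny constant factor:

```python
import random
import timeit

def insertion_sort(a):
    """O(N^2), but a very small constant factor per element."""
    a = list(a)
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    """O(N log N), but with recursion and allocation overhead."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

data = [random.random() for _ in range(16)]
print("insertion:", timeit.timeit(lambda: insertion_sort(data), number=10000))
print("merge:    ", timeit.timeit(lambda: merge_sort(data), number=10000))
```

Which one wins depends on N and on your machine, which is exactly the point: measure, don't assume.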

Then there's ease of implementation. You usually don't need the best intro sort implementation to sort an array of 100 integers once, so don't bother.

Look for worst cases in your algorithms and if possible, try to avoid them. If you have a generally fast algorithm but with a very bad worst case, consider detecting that worst case and solving it using another algorithm that is generally slower but better for that single case.
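This is essentially what introsort does: run quicksort until the recursion depth gets suspiciously deep (the sign of a bad worst case for the chosen pivots), then fall back to heapsort, which is slower on average but has a guaranteed O(N log N) worst case. A rough Python sketch of the idea, not production code:

```python
import heapq

def hybrid_sort(a, depth=None):
    """Quicksort, falling back to heapsort when recursion gets too deep."""
    if depth is None:
        depth = 2 * max(1, len(a)).bit_length()
    if len(a) <= 1:
        return list(a)
    if depth == 0:
        # Worst case detected: switch to heapsort, O(N log N) regardless of input.
        h = list(a)
        heapq.heapify(h)
        return [heapq.heappop(h) for _ in range(len(h))]
    pivot = a[len(a) // 2]
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return hybrid_sort(less, depth - 1) + equal + hybrid_sort(greater, depth - 1)
```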

Consider space and time tradeoffs. If you can afford the memory in order to get better speeds, there's probably no reason not to do it, especially if you really need the speed. If you can't afford the memory but can afford to be slower, do that.
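Memoization is the classic form of this tradeoff; a sketch using Python's `functools.lru_cache` to spend memory on previously computed results in exchange for speed:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # spend O(N) memory on a cache...
def fib(n):
    # ...to turn the exponential-time naive recursion into linear time
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(100))  # finishes instantly; the uncached version would take ages
```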

If you can, use existing libraries. Don't roll your own multiprecision library if you can use GMP for example. For C++, stuff like boost and even the STL containers and algorithms have been worked on for years by an army of people and are most likely better than you can do alone.

IVlad answered Sep 20 '22


Stability (sorting) - Does the algorithm maintain the relative order of equal elements?

Numeric Stability - Is the algorithm prone to error when very large or small real numbers are used?

Correctness - Does the algorithm always give the correct answer? If not, what is the margin of error?

Generality - Does the algorithm work in many situations (e.g. with many different data types)?

Compactness - Is the program for the algorithm concise?

Parallelizability - How well does performance scale when the number of concurrent threads of execution is increased?

Cache Awareness - Is the algorithm designed to maximize use of the computer's cache?

Cache Obliviousness - Is the algorithm tuned for particular cache sizes / cache-line sizes, or does it perform well regardless of the parameters of the cache?
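The stability and numeric-stability points above are easy to demonstrate in a few lines of Python (the example data is made up):

```python
import math

# Stability: Python's sorted() is stable, so items with equal keys
# keep their original relative order.
records = [("apple", 2), ("banana", 1), ("cherry", 2), ("date", 1)]
print(sorted(records, key=lambda r: r[1]))
# banana stays before date, and apple stays before cherry

# Numeric stability: naive float summation accumulates rounding error.
print(sum([0.1] * 10))        # 0.9999999999999999
print(math.fsum([0.1] * 10))  # 1.0 (compensated summation)
```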

Peter Alexander answered Sep 21 '22


Complexity. Two algorithms being the same in all other respects, the one that's much simpler is going to be a much better candidate for future customization and use.

Ease of parallelization. Depending on your use case, it might not make any difference or, on the other hand, make the algorithm useless because it can't use 10000 cores.
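For embarrassingly parallel work, the scaling experiment is cheap to run with just the standard library; a hypothetical sketch using `concurrent.futures` (the work function is a made-up stand-in):

```python
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # stand-in for a CPU-bound task
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [100_000] * 8
    # One worker process per core by default; timing this against a plain
    # map(work, inputs) shows how well the task actually parallelizes.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(work, inputs))
    print(len(results))
```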

Tomislav Nakic-Alfirevic answered Sep 20 '22


Stability - some algorithms may "blow up" under certain test conditions, e.g. take an inordinately long time to execute, use an inordinately large amount of memory, or perhaps never terminate.

Paul R answered Sep 23 '22