Is there a chart or table anywhere that displays a lot of data structures and algorithms (at least the popular ones) with their running times and efficiency?
What I am looking for is something that I can glance at, and decide which structure/algorithm is best for a particular case. It would be helpful when working on a new project or just as a study guide.
A chart or table isn't going to be a particularly useful reference.
If you're going to be using a particular algorithm or data structure to tackle a problem, you'd better know and understand it inside and out. And that includes knowing (and knowing how to derive) its efficiency. It's not particularly difficult. Most standard algorithms have simple, intuitive run-times like O(N²) or O(N log N).
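For instance, here's a rough Python sketch (the helper functions are made up purely for illustration) of how those bounds fall straight out of the structure of the code:

```python
# Purely illustrative: counting basic operations as a function of input size N
# is usually enough to read off the run-time.

def has_duplicate(items):
    """Nested loops over N items: about N*(N-1)/2 comparisons, so O(N^2)."""
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False

def binary_search(sorted_items, target):
    """The search range halves each iteration, so O(log N) comparisons."""
    lo, hi = 0, len(sorted_items)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo < len(sorted_items) and sorted_items[lo] == target
```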
That being said, run-time Big-O isn't everything. Take sorting, for example: heap sort has a better worst-case Big-O than quicksort (O(N log N) vs. O(N²)), yet quicksort performs much better in practice. The constant factors hidden inside a Big-O can also make a huge difference.
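As a rough illustration (a sketch, not a definitive benchmark; both sorts below are simple pure-Python versions written just for this example), the only real way to settle such a question is to measure on your own data:

```python
# Illustrative only: pure-Python versions of both sorts so the comparison is
# apples-to-apples. The absolute numbers (and even the winner) depend on your
# machine, your data, and your implementations.
import random
import time

def heapsort(a):
    def sift_down(root, end):
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return
    n = len(a)
    for start in range(n // 2 - 1, -1, -1):   # build a max-heap
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):           # repeatedly move the max to the end
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        pivot = a[random.randint(lo, hi)]     # random pivot makes the O(n^2) case unlikely
        i, j = lo, hi
        while i <= j:
            while a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i, j = i + 1, j - 1
        quicksort(a, lo, j)                   # recurse on one part,
        lo = i                                # loop on the other

data = [random.random() for _ in range(50_000)]
for sort in (heapsort, quicksort):
    work = data[:]
    start = time.perf_counter()
    sort(work)
    print(sort.__name__, round(time.perf_counter() - start, 3), work == sorted(data))
```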
When you're talking about data structures, there's a lot more to them than meets the eye. For example, a hash map seems like just a tree map with much better performance, but a tree map keeps its keys in sorted order, which gives you ordered iteration and range queries that a hash map can't offer.
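As a quick illustration (assuming Python, with the third-party sortedcontainers package standing in for the tree-map side):

```python
# Sketch only. Python's built-in dict is hash-based; SortedDict from the
# third-party "sortedcontainers" package stands in for a tree map here.
from sortedcontainers import SortedDict

hashed = {}              # O(1) average lookup/insert, no sorted-order guarantees
ordered = SortedDict()   # O(log n)-ish lookup/insert, keys kept sorted

for key in [42, 7, 19, 3, 88]:
    hashed[key] = ordered[key] = f"value-{key}"

print(list(hashed))                    # insertion order, not sorted
print(list(ordered))                   # [3, 7, 19, 42, 88], always sorted
print(list(ordered.irange(5, 50)))     # range query: all keys in [5, 50]
print(ordered.peekitem(0))             # smallest entry, cheap in a tree map
```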
Knowing which algorithm or data structure is best to use is a matter of knowledge and experience, not a lookup table.
Back to your question, though: I don't know of any such reference. It would be a good exercise to make one yourself. Wikipedia has pretty decent articles on common algorithms and data structures, along with some decent analysis.
I don't believe that any such list exists. The sheer number of known algorithms and data structures is staggering, and new ones are being developed all the time. Moreover, many of these algorithms and data structures are specialized, meaning that even if you had a list in front of you it would be difficult to know which ones were applicable for the particular problems you were trying to solve.
Another concern with such a list is how to quantify efficiency. If you were to rank algorithms in terms of asymptotic complexity (big-O), you might end up putting algorithms and data structures that are asymptotically optimal but impractically slow on small inputs ahead of algorithms that are known to be fast in practical cases but aren't theoretically perfect. As an example, consider the median-of-medians algorithm for linear-time order statistics, which has such a huge constant factor that other algorithms tend to be much better in practice. Or consider quicksort, which in the worst case is O(n²) but in practice has average complexity O(n log n) and is much faster than other sorting algorithms.
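To make that concrete, here's an illustrative randomized quickselect sketch (expected O(n), worst case O(n²)): on paper it's weaker than median-of-medians, but it's exactly the kind of simple algorithm that tends to win in practice.

```python
# Illustrative randomized quickselect: expected O(n), worst case O(n^2),
# yet typically faster than the deterministic median-of-medians algorithm.
import random

def quickselect(items, k):
    """Return the k-th smallest element (0-indexed) of items."""
    items = list(items)
    while True:
        pivot = random.choice(items)
        less = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        if k < len(less):
            items = less                     # answer lies among the smaller values
        elif k < len(less) + len(equal):
            return pivot                     # the pivot itself is the answer
        else:
            k -= len(less) + len(equal)      # skip past the smaller and equal values
            items = [x for x in items if x > pivot]

print(quickselect([9, 1, 7, 3, 5], 2))       # 5, the median of the five values
```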
On the other hand, were you to try to list the algorithms by runtime efficiency, the list would be misleading. Runtime efficiency depends on a number of machine- and input-specific factors (such as locality, size of the input, shape of the input, speed of the machine, processor architecture, etc.). It might be useful as a rule of thumb, but in many cases the numbers might mislead you into picking one algorithm when another is far superior.
There's also implementation complexity to consider. Many algorithms exist only in papers, or have reference implementations that are not optimized or are written in a language that isn't what you're looking for. If you find a Holy Grail algorithm that does exactly what you want but has no available implementation, it might be prohibitively difficult to code up and debug your own version. For example, if there weren't a preponderance of red/black tree implementations, do you think you'd be able to code one up on your own? How about Fibonacci heaps? Or (from personal experience) van Emde Boas trees? Often it's a good idea to pick a simpler algorithm that's "good enough" and easy to implement over a much more complex one.
In short, I wish a table like this existed with all of that information, but practically speaking I doubt one could be constructed in a way that's useful. The Wikipedia links from @hammar's comments are actually quite good, but the best way to learn which algorithms and data structures to use in practice is by getting practice trying them out.
Collecting all algorithms and/or data structures is essentially impossible; even as I'm writing this, somebody, somewhere, is undoubtedly inventing a new algorithm or data structure. In the greater scheme of things it's probably not of much significance, but it's still probably new and (ever so slightly) different from anything anybody's done before (though, of course, it's always possible it'll turn out to be a big, important thing).
That said, the US NIST has a Dictionary of Algorithms and Data Structures that lists more than most people will ever know or care about. It covers most of the obvious "big" ones that everybody knows, and an awful lot of lesser-known ones as well. The University of Canterbury has another that is (or at least seems to me to be) a bit more modest, but it still covers most of what a typical programmer probably cares about, and it's a bit better organized for finding an algorithm to solve a particular problem rather than requiring you to already know the name of the algorithm you want.
There are also various collections/lists that are more specialized. For example, The Stony Brook Algorithm Repository is devoted primarily (exclusively?) to combinatorial algorithms. It's based on the Algorithm Design Manual, so it can be particularly useful if you have/use that book (and in case you're wondering, this book is generally quite highly regarded).