 

Your favourite algorithm and the lesson it taught you [closed]



General algorithms:

  • Quicksort (and its average-case complexity analysis), which shows that randomizing your input can be a good thing;
  • balanced trees (AVL trees for example), a neat way to balance search/insertion costs;
  • Dijkstra and Ford-Fulkerson algorithms on graphs (I like the fact that the second one has many applications);
  • the LZ* family of compression algorithms (LZW for example), data compression sounded kind of magic to me until I discovered it (a long time ago :) );
  • the FFT, ubiquitous (re-used in so many other algorithms);
  • the simplex algorithm, ubiquitous as well.
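
The quicksort point above is worth seeing concretely. A minimal sketch of randomized quicksort in Python (not production code; an in-place partition would avoid the extra lists):

```python
import random

def quicksort(a):
    # Choosing the pivot at random makes the *expected* running time
    # O(n log n) for any input order, including sorted input --
    # the adversary can no longer pick a bad input in advance.
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```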

Numerical related:

  • Euclid's algorithm to compute the gcd of two integers: one of the first algorithms, simple and elegant, powerful, has lots of generalizations;
  • fast multiplication of integers (Karatsuba, or the FFT-based Schönhage–Strassen method, for example);
  • Newton iterations to invert / find a root, a very powerful meta-algorithm.
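
Euclid's algorithm really is as simple as advertised; a sketch in Python:

```python
def gcd(a, b):
    # Euclid's algorithm: gcd(a, b) = gcd(b, a mod b).
    # The remainder shrinks on every step, so this terminates quickly
    # (O(log min(a, b)) iterations).
    while b:
        a, b = b, a % b
    return a
```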

Number theory-related:

  • AGM-related algorithms (examples): leads to very simple and elegant algorithms to compute pi (and much more!), though the theory is quite profound (Gauss introduced elliptic functions and modular forms from it, so you can say that it gave birth to algebraic geometry...);
  • the number field sieve (for integer factorization): very complicated, but quite a nice theoretical result (this also goes for the AKS algorithm, which proved that PRIMES is in P).

I also enjoyed studying quantum computing (Shor's and Deutsch–Jozsa algorithms, for example): this teaches you to think outside the box.

As you can see, I'm a bit biased towards maths-oriented algorithms :)


"To iterate is human, to recurse divine" - quoted in 1989 at college.

P.S. Posted by Woodgnome while waiting for invite to join


Floyd-Warshall all-pairs shortest paths algorithm

procedure FloydWarshall ()
   // path[i][j] is initialized to the weight of edge (i, j),
   // infinity if there is no edge, and 0 when i = j
   for k := 1 to n
     for i := 1 to n
        for j := 1 to n
          path[i][j] := min ( path[i][j], path[i][k] + path[k][j] )
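
A runnable Python version of the pseudocode above, assuming the distance matrix is given as a list of lists with `float("inf")` for missing edges:

```python
INF = float("inf")

def floyd_warshall(dist):
    # dist is an n x n matrix: dist[i][j] is the weight of edge i -> j,
    # INF if the edge is absent, and 0 on the diagonal.
    # After the k-th outer iteration, dist[i][j] is the shortest path
    # from i to j using only intermediate vertices from {0, ..., k}.
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```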

Here's why it's cool: when you first learn about the shortest-path problem in your graph theory course, you probably start with Dijkstra's algorithm, which solves single-source shortest path. It's quite complicated at first, but then you get over it and you fully understand it.

Then the teacher says "Now we want to solve the same problem but for ALL sources". You think to yourself, "Oh god, this is going to be a much harder problem! It's going to be at least N times more complicated than Dijkstra's algorithm!!!".

Then the teacher gives you Floyd-Warshall. And your mind explodes. Then you start to tear up at how beautifully simple the algorithm is. It's just a triply-nested loop. It only uses a simple array for its data structure.


The most eye-opening part for me is the following realization: say you have a solution for problem A. Then you have a bigger "superproblem" B which contains problem A. The solution to problem B may in fact be simpler than the solution to problem A.


This one might sound trivial but it was a revelation for me at the time. I was in my very first programming class (VB6) and the Prof had just taught us about random numbers and he gave the following instructions: "Create a virtual lottery machine. Imagine a glass ball full of 100 ping pong balls marked 0 to 99. Pick them randomly and display their number until they have all been selected, no duplicates."

Everyone else wrote their program like this: pick a ball, put its number into an "already selected" list, and then pick another ball. Check to see if it's already selected; if so, pick another ball, and if not, put its number on the "already selected" list, etc.

Of course by the end they were making hundreds of comparisons to find the few balls that had not already been picked. It was like throwing the balls back into the jar after selecting them. My revelation was to throw balls away after picking.
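
The "throw balls away after picking" idea can be sketched in Python (hypothetical code, not the original VB6): remove each drawn ball from the pool, so there is never anything to check for duplicates.

```python
import random

def lottery_draw(n=100):
    # The pool shrinks with every draw, so each pick is O(1) random
    # choices -- no "already selected" list and no retry loop.
    balls = list(range(n))
    drawn = []
    while balls:
        i = random.randrange(len(balls))
        drawn.append(balls.pop(i))
    return drawn
```

(This is exactly the idea behind sampling without replacement; `random.shuffle` does the same thing more efficiently with an in-place swap instead of a `pop`.)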

I know this sounds mind-numbingly obvious, but this was the moment the "programming switch" got flipped in my head. This was the moment programming went from trying to learn a strange foreign language to trying to figure out an enjoyable puzzle. And once I made that mental connection between programming and fun, there was really no stopping me.