
Why is the order of an algorithm generally more important than the speed of the processor? [closed]

So, I know that efficiency is determined by the algorithms and data structures used in a solution. But I don't really understand how the order of an algorithm can be more important than the speed of the processor.

asked Apr 18 '15 by jimo


1 Answer

A typical personal computer can do 10^8 calculations per second.
And the world's fastest supercomputer does 10^16 calculations per second.

So suppose an O(n) algorithm is running on your laptop/desktop, while an O(n^2) algorithm runs simultaneously on the world's fastest supercomputer. For n = 10^10:

Running time on the PC = 10^10 / 10^8 = 100 seconds.
Running time on the supercomputer = (10^10)^2 / 10^16 = 10^20 / 10^16 = 10,000 seconds.

Thus the laptop clearly outperforms the supercomputer by a huge margin: it is in fact 100 times faster, simply by using a better algorithm.
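The arithmetic above can be checked with a short sketch. The throughput figures (10^8 and 10^16 operations per second) are the round numbers assumed in this answer, not measurements:

```python
# Hypothetical throughput figures used in the answer above.
PC_OPS_PER_SEC = 10**8         # typical personal computer
SUPER_OPS_PER_SEC = 10**16     # world's fastest supercomputer

n = 10**10

# O(n) algorithm on the PC: roughly n operations in total.
pc_seconds = n / PC_OPS_PER_SEC

# O(n^2) algorithm on the supercomputer: roughly n^2 operations in total.
super_seconds = n**2 / SUPER_OPS_PER_SEC

print(pc_seconds)                  # 100.0 seconds
print(super_seconds)               # 10000.0 seconds
print(super_seconds / pc_seconds)  # the PC finishes 100x sooner
```

The point generalizes: the n^2 term grows so much faster than n that, past some input size, no realistic hardware advantage can make up for the worse algorithm.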

Another reason we look for better algorithms is scalability. According to Moore's law, computing power doubles roughly every 18 months. So even if a supercomputer can handle huge inputs quickly today, it may not be able to do so later, when the problem size has grown manyfold while computing power has only doubled in the next 18 months.
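This scalability gap can be made concrete. Assuming the Moore's-law doubling rate mentioned above, the sketch below estimates how long hardware alone would need to absorb the extra work an O(n^2) algorithm incurs when its input grows 10x (so total work grows 100x). The function name and the 18-month figure are illustrative assumptions:

```python
import math

def months_until_hardware_catches_up(work_multiplier, doubling_period_months=18):
    """Months of Moore's-law doubling needed for compute to grow by work_multiplier."""
    # Compute doubles every doubling_period_months, so it grows by a factor of
    # work_multiplier after log2(work_multiplier) doubling periods.
    return math.log2(work_multiplier) * doubling_period_months

# Input to an O(n^2) algorithm grows 10x => total work grows 100x.
# log2(100) ~= 6.64 doublings, i.e. roughly 120 months (~10 years) of waiting,
# versus switching to an O(n) algorithm, which absorbs the 10x input immediately.
print(months_until_hardware_catches_up(100))
```

In other words, waiting for faster hardware takes about a decade to cover what a better asymptotic order handles for free.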

answered Dec 03 '22 by Nikunj Banka