
How often do computers make mistakes?

I don't mean programming mistakes, which are ultimately made by a human, but errors that occur when performing something as simple as adding two numbers.

For what range of x should I expect a mistake rate of 1 in x?

Quora Feans asked Jan 04 '14



2 Answers

In terms of CPUs, there are three possible sources of mistakes which seem to be in the scope of your question:

  1. Floating point rounding errors. This seems to be what you are getting at with your division example. This type of error is completely deterministic in practice, not random at all! However, if the programming language you are using leaves floating point behaviour underspecified, you may get different errors on different computers. (See the sketch just after this list.)
  2. Design mistakes in a CPU, such as the infamous Intel Pentium FDIV bug. It's hard to put a probability on this, but fortunately modern CPUs are extensively tested, and even formal methods are used to mathematically prove their correctness to some extent.
  3. Hardware errors caused by radiation, such as cosmic rays. Unless you put your computer inside the reactor of a nuclear power station or something, the probability of errors caused by radiation should generally be negligible. Interestingly, this is actually relevant to certain programming techniques such as hashing in revision control systems. You can make the argument "Well, it's more likely that we get an error due to a cosmic ray, than a hash collision, so it's not worth worrying about the possibility of a hash collision". A rough back-of-the-envelope version of that calculation is sketched below.
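
To make point 1 concrete, here is a minimal Python sketch showing that floating point rounding error is repeatable rather than random. The exact decimal values shown assume standard IEEE 754 double precision, which is what virtually all current hardware uses.

```
a = 0.1 + 0.2
print(a)               # 0.30000000000000004 -- 0.1 and 0.2 have no exact binary representation
print(a == 0.3)        # False: the sum carries a tiny rounding error...
print(a == 0.1 + 0.2)  # ...but recomputing gives the identical "wrong" answer every time

# The error is a property of the arithmetic, not a random glitch:
assert all((0.1 + 0.2) == a for _ in range(1_000_000))
```

The same holds for the 1/x example in the question: 1/3 rounds to the same 53-bit value on every run, on any machine that follows IEEE 754.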

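Point 3's comparison can also be put into rough numbers. This is only an illustrative calculation, assuming a 160-bit hash such as Git's SHA-1 and a hypothetical repository of ten million objects; the birthday bound then gives an upper estimate of the collision probability.

```
n = 10_000_000        # hypothetical number of hashed objects
space = 2 ** 160      # size of a 160-bit hash space (e.g. SHA-1)

# Birthday-bound approximation: P(any collision) <= n * (n - 1) / (2 * space)
p_collision = n * (n - 1) / (2 * space)
print(f"{p_collision:.2e}")   # roughly 3.4e-35
```

Any realistic estimate of radiation-induced bit flips over the life of a machine dwarfs a number that small, which is exactly the argument made above.
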
Other components of a computer, such as storage devices and display devices, are much, much more likely than a CPU to exhibit hardware errors leading to data corruption.

Robin Green answered Nov 12 '22


Following on from @Robin Green's answer, there are actually a few more potential causes of hardware error besides cosmic rays:

  • Electrical Noise: Thermal noise is present in all electronic circuits, as are effects such as inductive coupling.
  • Quantum events: As features on semiconductors become ever smaller (particularly gate dielectrics) and the number of electrons involved in each state change shrinks, the finite (but small) probability of electrons being in a high-energy state and flipping a logic state becomes significant.

There are design solutions to all of these problems, but they come at a price we might not want to accept in terms of size, power consumption and integration density. Radiation-hardened semiconductors are notable for their low integration density, relatively low performance and high cost.

It's also worth noting that in communications and storage, hardware errors are commonplace, and rather than preventing them in the first place, the strategy is to recover from them with error detection and correction techniques.
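
As a toy illustration of that strategy, here is a Hamming(7,4) code in Python: four data bits are stored with three parity bits, and any single flipped bit can be located from the parity checks and repaired. Real storage and memory controllers use far stronger codes, but the principle is the same. (The function names here are just for this sketch.)

```
def encode(d):
    """d is a list of 4 data bits; returns a 7-bit codeword laid out as
    p1 p2 d1 p3 d2 d3 d4 (positions 1..7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """Recomputes the parity checks; a non-zero syndrome gives the
    position of the flipped bit, which is then repaired in place."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s3 * 4 + s2 * 2 + s1   # 1-indexed error position, 0 if none
    if pos:
        c[pos - 1] ^= 1
    return c

data = [1, 0, 1, 1]
word = encode(data)
word[5] ^= 1                     # simulate a single-bit hardware error
fixed = correct(word)
print(fixed[2], fixed[4], fixed[5], fixed[6])  # recovered data bits: 1 0 1 1
```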

marko answered Nov 12 '22