
How precise is the internal clock of a modern PC?

I know that 10 years ago, typical clock precision equaled a system tick, which was in the range of 10-30 ms. Over the past years, precision has been increased in multiple steps. Nowadays, there are ways to measure time intervals in actual nanoseconds. However, common frameworks still return time with a precision of only around 15 ms.

My question is: which steps increased the precision, how is it possible to measure in nanoseconds, and why do we still often get precision far coarser than a microsecond (for instance in .NET)?
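To illustrate what I mean, here is a minimal sketch (assuming Windows and a C compiler; the exact numbers vary by machine) that contrasts the coarse tick counter, which typically advances in roughly 15.6 ms steps, with the high-resolution performance counter:

```c
/* Minimal sketch (assumes Windows): contrast the coarse tick counter,
 * which typically advances in ~15.6 ms steps, with the high-resolution
 * performance counter, which usually resolves well below a microsecond. */
#include <stdio.h>
#include <windows.h>

int main(void)
{
    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);      /* counter ticks per second */

    /* Watch GetTickCount64() advance: it jumps in whole timer interrupts. */
    ULONGLONG start = GetTickCount64();
    while (GetTickCount64() == start)
        ;                                  /* spin until the tick changes */
    printf("coarse tick advanced by %llu ms\n",
           (unsigned long long)(GetTickCount64() - start));

    /* Two back-to-back reads of the performance counter differ by far less. */
    QueryPerformanceCounter(&t0);
    QueryPerformanceCounter(&t1);
    printf("performance counter delta: %.0f ns\n",
           (t1.QuadPart - t0.QuadPart) * 1e9 / (double)freq.QuadPart);
    return 0;
}
```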

(Disclaimer: It strikes me as odd that this was not asked before, so I guess I missed this question when I searched. Please close and point me to the question in that case, thanks. I believe this belongs on SO and not on any other SOFU site. I understand the difference between precision and accuracy.)

asked Apr 09 '10 by mafu


1 Answer

It really is a feature of the history of the PC. The original IBM PC used a chip called the Real Time Clock (RTC), which was battery backed (do you remember needing to change the batteries on these?). It kept running and tracking the time while the machine was powered off. Its frequency was 32.768 kHz (2^15 cycles/second), which made it easy to calculate time on a 16-bit system. This real time clock value was written to CMOS and was available via an interrupt system in older operating systems.
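As a small aside, here is a sketch of why 32,768 Hz is so convenient on binary hardware (illustration only; the tick count below is made up):

```c
/* Illustration only: why 32.768 kHz is convenient on binary hardware.
 * 32768 = 2^15, so a 15-bit divider chain overflows exactly once per second,
 * and converting raw oscillator ticks to whole seconds is a simple shift. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    const uint32_t osc_hz = 32768;            /* RTC crystal frequency */
    uint64_t raw_ticks = 5ULL * osc_hz + 123; /* some hypothetical tick count */

    uint64_t seconds = raw_ticks >> 15;       /* divide by 32768 with a shift */
    printf("32768 == 2^15? %s, %llu ticks = %llu s\n",
           (osc_hz == (1u << 15)) ? "yes" : "no",
           (unsigned long long)raw_ticks, (unsigned long long)seconds);
    return 0;
}
```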

A newer standard from Microsoft and Intel, the High Precision Event Timer (HPET), specifies a minimum clock frequency of 10 MHz: http://www.intel.com/hardwaredesign/hpetspec_1.pdf. Newer PC architectures put it on the Northbridge controller, where the HPET can run at 100 MHz or even greater. At 10 MHz we should be able to get a resolution of 100 nanoseconds, and at 100 MHz a resolution of 10 nanoseconds.
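A tiny sketch of that resolution arithmetic (the frequencies are just the example values quoted above, not read from real hardware):

```c
/* Sketch of the resolution arithmetic described above:
 * period (ns) = 1e9 / counter frequency (Hz). The frequencies are
 * the example values from the text, not read from hardware. */
#include <stdio.h>

static double period_ns(double counter_hz)
{
    return 1e9 / counter_hz;
}

int main(void)
{
    printf("10 MHz  -> %.0f ns per tick\n", period_ns(10e6));  /* 100 ns */
    printf("100 MHz -> %.0f ns per tick\n", period_ns(100e6)); /* 10 ns  */
    return 0;
}
```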

The following operating systems are known not to be able to use HPET: Windows XP, Windows Server 2003 and earlier Windows versions, and older Linux versions.

The following operating systems are known to be able to use HPET: Windows Vista, Windows 2008, Windows 7, x86-based versions of Mac OS X, Linux operating systems using the 2.6 kernel, and FreeBSD.

With a Linux kernel, you need the newer "rtc-cmos" hardware clock device driver rather than the original "rtc" driver.
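Here is a minimal sketch (assuming a Linux kernel that exposes its clock sources through sysfs) of how to check which clock source, e.g. hpet or tsc, the kernel is actually using:

```c
/* Sketch (assumes a Linux kernel that exposes clock sources via sysfs):
 * print which clock source the kernel is currently using (e.g. "hpet"
 * or "tsc") and which ones are available. */
#include <stdio.h>

static void dump(const char *label, const char *path)
{
    char buf[256];
    FILE *f = fopen(path, "r");
    if (f && fgets(buf, sizeof buf, f))
        printf("%s: %s", label, buf);      /* sysfs value ends with '\n' */
    if (f)
        fclose(f);
}

int main(void)
{
    dump("current  ",
         "/sys/devices/system/clocksource/clocksource0/current_clocksource");
    dump("available",
         "/sys/devices/system/clocksource/clocksource0/available_clocksource");
    return 0;
}
```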

All that said, how do we access this extra resolution? I could cut and paste from previous Stack Overflow answers, but I won't; just search for HPET and you will find answers on how to get finer-grained timers working. A minimal sketch is given below.
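For example, on Linux one common route is clock_gettime with CLOCK_MONOTONIC. This is only a sketch, and the achievable resolution still depends on the underlying clock source (on older glibc you may also need to link with -lrt):

```c
/* One common way to reach the finer-grained timers on Linux (sketch):
 * clock_gettime(CLOCK_MONOTONIC) returns nanosecond-typed timestamps;
 * the achievable resolution still depends on the underlying clock source. */
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec res, t0, t1;

    clock_getres(CLOCK_MONOTONIC, &res);   /* resolution reported by the OS */
    printf("reported resolution: %ld ns\n", res.tv_nsec);

    clock_gettime(CLOCK_MONOTONIC, &t0);
    clock_gettime(CLOCK_MONOTONIC, &t1);   /* two back-to-back reads */
    printf("back-to-back delta : %ld ns\n",
           (long)((t1.tv_sec - t0.tv_sec) * 1000000000L
                  + (t1.tv_nsec - t0.tv_nsec)));
    return 0;
}
```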

answered Oct 02 '22 by Romain Hippeau