
How can I handle a leap second correctly in my application

I am creating applications and I'd like to know how I should/can handle a leap second. I'll try to describe the problem with a (hopefully) simple example situation. In this simple situation you could easily argue that an extra second of waiting time every ±1.5 years may not be relevant, but I would still sleep better knowing it works correctly/the way I want it to in 'all' situations :)


Situation


(We don't take any delays into account, so when a user presses a button it 'happens' immediately)

You have a game in which you create soldiers.

  • Each soldier takes 110 seconds to create.
  • The user presses the button to create 1 soldier at day 1 - 23:58:30
  • The user then expects the soldier to be created 110s later. Through a normal 'DateTime calculation' your application would end up with day 2 - 00:00:20.

Now there happens to be a leap second between day 1 and day 2 --> 23:59:60. Following this method the user will in actuality have waited 111s for his soldier.

I'd prefer to use the Unix timestamp. This way you only have to add 110s to the current time in seconds. But as far as I know this doesn't take leap seconds into account either. You'd still end up waiting 111s in the actual time.
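
For illustration, here is that naive wall-clock calculation sketched in Node.js (the names like isSoldierReady are made up for this example):

```js
// Naive approach: the finish time is a wall-clock stamp 110s ahead.
// Unix timestamps ignore leap seconds, so if one is inserted inside
// this interval the user really waits 111 physical seconds.
const BUILD_TIME_MS = 110 * 1000;

const startedAt = Date.now();              // ms since the Unix epoch
const finishesAt = startedAt + BUILD_TIME_MS;

function isSoldierReady() {
  return Date.now() >= finishesAt;         // compares wall-clock stamps
}
```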


Question


What should I do to make sure a user or program is only waiting the time it should wait?

Is there a commonly used time stamp that does take leap seconds into account?

Should I always check whether a leap second is occurring? (That may waste a lot of CPU power?)

EDIT: I mostly work in JavaScript (Node.js) but an example in C, PHP or Python will work fine too!

asked Jun 22 '16 by Dex


3 Answers

UTC time differs from atomic time (TAI) by precisely those leap seconds. Without a reference to atomic time, you cannot detect from UTC alone when a leap second has been inserted. This makes UTC only 'almost' continuous, because of these occasional one-second jumps (scheduled so that UTC never drifts more than about 0.9s from the Earth's rotation time, UT1). UTC is defined this way to keep calendar time matched with the Earth's movement, but despite that, it can be treated as continuous by simply ignoring the leap seconds.
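
To make that concrete: the atomic-to-UTC offset can only be recovered from a published leap-second list. A minimal Node.js sketch, assuming a hand-maintained table (the function name is made up; only a few recent entries are shown, the authoritative list is IERS Bulletin C):

```js
// Sketch: look up TAI - UTC for a given UTC instant from a
// hand-maintained leap-second table.
const LEAP_TABLE = [
  // [first UTC instant where the offset applies, TAI - UTC in seconds]
  [Date.UTC(2012, 6, 1), 35],
  [Date.UTC(2015, 6, 1), 36],
  [Date.UTC(2017, 0, 1), 37],
];

function taiMinusUtc(utcMillis) {
  let offset = 34;                 // value in force before the entries above
  for (const [since, delta] of LEAP_TABLE) {
    if (utcMillis >= since) offset = delta;
  }
  return offset;
}

console.log(taiMinusUtc(Date.UTC(2016, 5, 22)));  // 36
```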

When a leap second is inserted, you notice nothing on your clock: only the atomic-to-UTC difference has been corrected. Only if you calculate planet or satellite orbits do you have to convert to the atomic timescale, or you'll be leaving the whole leap second out of your calculations (and making your orbits change). The actual effect is as if a phantom had held UTC back one second. It only affects the numbering of seconds: the difference between atomic time and UTC grows by one more. You notice nothing happening on your machine, and timestamps taken on either side of the leap second are not affected in real time, because the insertion only changes the numbering of atomic seconds versus UTC seconds.

If you do have to consider an interval that overlaps the leap second (for example, when calculating the orbital parameters of some extraterrestrial vehicle), then you must take that extra second (which really elapsed as a physical second, but is invisible to naive UTC label arithmetic) and add it to the interval, or your calculation will be wrong. But astrophysicists always do their calculations in the proper timescale, so don't worry about space debris falling on your head by mistake.

If you use clock synchronization programs (like ntpd, for example) then the insertion of the leap second can happen in several ways:

  • The second is inserted by stepping the system clock by one second at exactly the moment of the leap second insertion. This makes weird things happen, as timeouts running across that moment are directly affected by the clock step (see the sketch below).

  • The clock is slewed for some time before the leap second (let's say two hours before) and runs normally again some time after it. You get a continuous timescale whose seconds are slightly longer than standard atomic seconds while the extra second is absorbed.

  • Let the clock run. The clock discipline loop suddenly "sees" a one-second offset between the reference (the "new" UTC scale) and its internal clock, so it begins to correct it in the normal way (by adjusting the clock speed). This is the same as the previous approach, but leads to larger offsets (a whole second, against half a second for the previous point).

I don't know at this moment whether your clock synchronization application follows the first or the second approach. At least on Linux I think the second or the third is used, and normally you don't notice it.
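
To make the first bullet concrete, here is a hedged Node.js sketch of a timeout driven purely by the wall clock. If the system clock is stepped back one second at the leap second, Date.now() replays that second and the deadline fires one physical second late:

```js
// A wall-clock deadline: any step adjustment of the system clock
// moves the physical moment at which this fires.
const deadline = Date.now() + 110 * 1000;

const timer = setInterval(() => {
  if (Date.now() >= deadline) {    // wall-clock comparison
    clearInterval(timer);
    console.log('soldier ready');
  }
}, 250);
```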

NOTE

In the case you give as an example, and assuming the worst case (the time is synchronized by making a step adjustment to the clock), you'll get the soldier created in 111s instead of 110s, but that's nothing to worry about: everything in that lapse happened over 111s instead of 110. You wait 1.0% longer for your soldier, but everything else ran 1.0% slower over that same period, so your soldier suffers no actual penalty against soldiers that happened to be conceived before or after the leap second.

And finally, if you don't use a time synchronization program, your clock will suffer more from its actual offset (the difference between your clock's time and the true time) than from the jumps of occasional step adjustments.

answered Nov 05 '22 by Luis Colorado


You really should use a timestamp here.

Timestamps are just the number of seconds that have passed since a predefined date (called the "epoch"). It's just a counter. (Note that a standard POSIX timestamp pretends leap seconds don't exist: every day counts as exactly 86,400 seconds.)

It's immune to leap seconds, summer/winter time changes, and even an insane government that moves time zone borders every year.

With a timestamp you can always calculate what time it is now in UTC, GMT or Europe/Moscow (with leap seconds or without them; it depends on your tz_data configuration). The reverse operation is sometimes impossible.
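
A hedged sketch of that workflow in Node.js (zone and variable names are just examples): keep one epoch timestamp as the single source of truth, do plain counter arithmetic on it, and derive any zone's local representation on demand:

```js
// One epoch timestamp is the single source of truth.
const createdAt = Date.now();                  // epoch milliseconds
const readyAt = createdAt + 110 * 1000;        // plain counter math

// Derive human-readable local times from the counter on demand.
for (const timeZone of ['UTC', 'Europe/Moscow']) {
  const formatter = new Intl.DateTimeFormat('en-GB', {
    timeZone,
    dateStyle: 'short',
    timeStyle: 'medium',
  });
  console.log(timeZone, formatter.format(readyAt));
}
```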

answered Nov 05 '22 by Arnial


You cannot do that in JavaScript. There are no leap seconds in JavaScript. ECMAScript 5, section 15.9.1.1 says:

Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day

Time values aren't guaranteed to be monotonic, and when a leap second occurs, they most likely won't be. In POSIX, you could use clock_gettime with a clk_id of CLOCK_MONOTONIC (if _POSIX_MONOTONIC_CLOCK is defined).
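
The same idea is available in Node.js, where process.hrtime.bigint() exposes a monotonic counter unaffected by wall-clock steps. A minimal sketch for the soldier timer (names are illustrative):

```js
// Measure the 110s wait with a monotonic clock: immune to NTP
// steps, leap-second handling and manual clock changes.
const BUILD_NS = 110n * 1000000000n;           // 110s in nanoseconds

const startedAt = process.hrtime.bigint();     // monotonic, nanoseconds

function isSoldierReady() {
  return process.hrtime.bigint() - startedAt >= BUILD_NS;
}
```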

answered Nov 05 '22 by zayjuc