What is the number of one second ticks between Unix time epoch (01 Jan 1970) and GPS time epoch (06 Jan 1980)?
I have seen multiple answers from several sources on the web. One camp claims the answer is 315964800, the other claims it is 315964819. I always thought it was 315964800, but now am not so sure.
I just found my software baseline has been using 315964819 for the last eight years. I have a hard time understanding how it could have been 19 seconds off and no one noticed it when we integrated our embedded devices with other devices.
I think that whoever put 315964819 in the code baseline must have mistakenly used a TAI offset (19 seconds).
From what I understand, Unix time does not include leap seconds, which would indicate to me that 315964800 is the number of ticks between the two epochs. Then I think about how Unix time handles a leap second: it simply repeats a tick count when a leap second is inserted, and TAI was 19 seconds ahead of UTC by the GPS epoch (the initial 10-second offset when UTC began in 1972, plus 9 inserted leap seconds)... I start to wonder if the repeated ticks matter. I do not think so, but someone in this code's history thought so, and it seemed to work...
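As a sanity check on where the 19 comes from, here is a small sketch using the published leap-second dates (hard-coded here from the IERS bulletin data; the variable names are mine):

```python
# Leap seconds inserted between UTC's start (1972, when TAI - UTC = 10 s)
# and the GPS epoch (1980-01-06), per IERS Bulletin C.
leap_second_dates = [
    "1972-06-30", "1972-12-31", "1973-12-31", "1974-12-31",
    "1975-12-31", "1976-12-31", "1977-12-31", "1978-12-31",
    "1979-12-31",
]

# 10 s initial TAI-UTC offset at 1972-01-01, plus one second per insertion.
tai_minus_utc_at_gps_epoch = 10 + len(leap_second_dates)
print(tai_minus_utc_at_gps_epoch)  # -> 19
```

So 19 is the TAI-UTC offset at the GPS epoch, not a count of leap seconds inserted since 1970.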
The long and short of it is I am about to change a constant set in the dark ages of this product that has to do with timing, which is important for the platform, from what it had been to what I believe is more accurate, and I wanted some sort of thumbs-up from more knowledgeable people than me.
Can someone authoritative please step in here?
315964800 camp
315964819 camp
Also note that I'm only asking about Unix epoch to GPS epoch. I'm pretty sure we've got leap seconds since GPS epoch covered appropriately.
GPS time is a continuous time scale used by all satellites and observation systems. It is represented in seconds since the GPS epoch, 1980-01-06 00:00:00 UTC (the midnight at the end of January 5, 1980, when GPS time began).
Although UNIX time and epoch time are often used synonymously, they mean different things. Literally speaking, the epoch represents UNIX time 0 (midnight at the start of 1 January 1970). UNIX time, or the UNIX timestamp, refers to the number of seconds that have elapsed since the epoch.
In computing, Unix time (also known as Epoch time, Posix time, seconds since the Epoch, Unix timestamp or UNIX Epoch time) is a system for describing a point in time.
GPS reference time is referenced to UTC with zero point defined as midnight on the night of January 5, 1980. The time stamp consists of the number of weeks since that zero point and the number of seconds since the last week number change (0 to 604,799).
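That week/seconds-of-week representation unpacks with simple arithmetic. A sketch (the function name is illustrative, and no leap-second correction is applied, so the result is on the GPS time scale, not UTC):

```python
from datetime import datetime, timedelta

GPS_EPOCH = datetime(1980, 1, 6)  # 1980-01-06 00:00:00 UTC

def gps_week_to_datetime(week, seconds_of_week):
    # Result is on the GPS time scale; leap seconds are NOT applied here.
    assert 0 <= seconds_of_week <= 604799
    return GPS_EPOCH + timedelta(weeks=week, seconds=seconds_of_week)

print(gps_week_to_datetime(0, 0))       # -> 1980-01-06 00:00:00
print(gps_week_to_datetime(1, 604799))  # -> 1980-01-19 23:59:59
```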
The different values you stated are caused by mixing up the 1970 to 1980 offset with leap seconds.
The correct offset value is 315964800 seconds.
Explanation:
UTC and GPS time drift apart by one additional second roughly every 18 months on average. Each such leap second is inserted into the UTC time base to compensate for irregularities in the Earth's rotation.
GPS time is not adjusted by leap seconds.
Currently (2013) there is an offset of 16s:
GPS Time-UTC = 16 seconds
Unix time is a time format, not a time reference. It represents the number of seconds (or, in some APIs, milliseconds) since 1 January 1970 00:00:00 UTC. Ideally your system time is synchronized with UTC by a time server (NTP).
To convert and get your offset, you should use the fixed offset (6 Jan 1980 UTC − 1 Jan 1970 UTC) and THEN add the current GPS-to-UTC deviation (16 s as of 2013). E.g. make that value configurable, or read the current offset from a GPS device (receivers know the difference between UTC and GPS time).
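That recipe can be sketched in a few lines of Python (the function name is illustrative; in real code the GPS-UTC deviation should be configurable or read from the receiver, since it grows with each new leap second):

```python
GPS_UNIX_OFFSET = 315964800  # fixed offset: 6 Jan 1980 UTC - 1 Jan 1970 UTC

def unix_to_gps_seconds(unix_seconds, gps_minus_utc):
    # gps_minus_utc was 16 s in 2013; keep it configurable.
    return unix_seconds - GPS_UNIX_OFFSET + gps_minus_utc

# At the GPS epoch itself the GPS-UTC deviation was 0:
print(unix_to_gps_seconds(315964800, 0))  # -> 0
```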
The different values you stated are caused by mixing up the 1970-to-1980 offset with leap seconds. Don't do that; handle them separately.
This Java program (note that SimpleDateFormat needs an explicit pattern to parse these strings, and the snippet needs its imports):

    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.TimeZone;

    SimpleDateFormat df = new SimpleDateFormat("d.M.yyyy HH:mm:ss");
    df.setTimeZone(TimeZone.getTimeZone("UTC"));
    Date x = df.parse("1.1.1970 00:00:00");
    Date y = df.parse("6.1.1980 00:00:00");
    long diff = y.getTime() - x.getTime();
    long diffSec = diff / 1000;
    System.out.println("diffSec= " + diffSec);
Outputs this value:
diffSec= 315964800
So this is the correct offset between 1 Jan 1970 UTC and 6 Jan 1980 UTC, where GPS time began. To calculate the GPS time of a current UTC time, you must then correct for the further 16 seconds (as of 2013) introduced since 6 Jan 1980.
What is the number of one second ticks between Unix time epoch (01 Jan 1970) and GPS time epoch (06 Jan 1980)?
There are at least two possible answers:
What is the POSIX timestamp for 1980-01-06 UTC? Answer: 315964800 (exactly), in Python:

    from datetime import datetime, timedelta

    print((datetime(1980, 1, 6) - datetime(1970, 1, 1)) // timedelta(seconds=1))
    # -> 315964800
It is the number of seconds between the dates not counting leap seconds. In other words, the code shows how many "UT1" seconds (each ~1/86400 of a mean solar day) passed between the events.
The UTC and GPS time scales tick in SI seconds. The POSIX timestamp "forgets" leap seconds, and therefore the actual number of SI seconds between the dates is slightly larger than the POSIX timestamp.
315964800 is not the correct answer if you want to find elapsed seconds.

How many SI seconds elapsed between 1970-01-01 UTC and 1980-01-06 UTC? Answer: 315964811 (approximately).
To answer the second question, you need to know how many intercalary leap seconds were inserted between the two dates (convert UTC to the International Atomic Time (TAI)):
    #!/usr/bin/env python3
    from datetime import datetime, timedelta

    tai_posix_epoch = datetime(1970, 1, 1) + timedelta(seconds=8, microseconds=82)
    tai_gps_epoch = datetime(1980, 1, 6) + timedelta(seconds=19)
    print(round((tai_gps_epoch - tai_posix_epoch) / timedelta(seconds=1)))
    # -> 315964811
The difference between TAI and GPS time is constant to within tens of nanoseconds.
The time between 1970 and 1972 (when UTC was introduced) is a little fuzzy; the TAI-UTC difference is not integer number of seconds in that period:
    from decimal import Decimal as D

    MJD_1970_01_01 = 40587
    dAT_1970_01_01 = D("4.213170") + (MJD_1970_01_01 - 39126) * D("0.002592")
    # -> delta(AT) = TAI - UTC = Decimal('8.000082')  # 8 seconds, 82 microseconds
[Figure: the relation between the UT1, UTC, and TAI time scales over the years. Each step is a leap second, starting from TAI − UTC = 10 s on 1972-01-01; 26 positive leap seconds had been inserted as of 1 July 2015.]
The 315964819 timestamp could be explained if a 1970-01-01 00:00:00 TAI epoch is used:

    print(datetime(1970, 1, 1) + timedelta(seconds=315964819))  # TAI
    # 1980-01-06 00:00:19 TAI, i.e., 1980-01-06 00:00:00 UTC

i.e., exactly 315964819 SI seconds elapsed between 1970-01-01 00:00:00 TAI and 1980-01-06 00:00:00 UTC (note: the dates are expressed using different time scales).
"right" timezones use 1970-01-01 00:00:10 TAI
epoch (notice: 10 seconds) and therefore the corresponding timestamp for the GPS epoch (1980-01-06 00:00:00 UTC) is 315964809
(not 315964819
). Here's a succinct description of the difference between "right" and POSIX timestamps:
The "right" files in the tz (zoneinfo) database have a subtle difference from the POSIX standard. POSIX requires that the system clock value of
time_t
represent the number of non-leap seconds since 1970-01-01. This is the same as requiring POSIX seconds to be mean solar seconds of UT, not the atomic seconds that UTC has counted since 1972-01-01.The "right" zoneinfo files assert that the system clock value of
time_t
represent the actual number of seconds in the internationally approved broadcast time scale since 1970-01-01. As a result the value of time_t which is expected by the "right" zoneinfo files is greater than the value of time_t specified by POSIX. The difference in the values of time_t is the number of leap seconds which have been inserted into the internationally approved broadcast time scale. As of year 2015 the difference is 26 seconds.emphasize is mine
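The relationship between the two conventions at the GPS epoch can be spelled out numerically (the leap-second count comes from the IERS bulletin data; the variable names are mine):

```python
posix_gps_epoch = 315964800   # POSIX timestamp of 1980-01-06 00:00:00 UTC
leaps_before_gps_epoch = 9    # leap seconds inserted 1972 through 1979

# "right" timestamps count the inserted leap seconds too:
right_gps_epoch = posix_gps_epoch + leaps_before_gps_epoch
print(right_gps_epoch)  # -> 315964809
```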
Can someone authoritative please step in here?
IERS BULLETIN C (the data that I've used above) is the authority on leap seconds (and therefore (indirectly) on the difference between UTC and GPS time scales).