Conversion from UNIX time to a timestamp starting on January 1, 2000

I am trying to interact with an API that uses a timestamp that starts at a different time than UNIX epoch. It appears to start counting on 2000-01-01, but I'm not sure exactly how to do the conversion or what the name of this datetime format is.

When I send a message at 1456979510 I get a response back saying it was received at 510294713.
The difference between the two is 946684796 (sometimes 946684797) seconds, which is approximately 30 years.

Can anyone let me know the proper way to convert between the two? Or whether I can generate them outright in Python?

Thanks

Edit

An additional detail I should have mentioned is that this is an API to a Zigbee device. I found the following datatype entry in their documentation:

1.3.2.7 Absolute time
This is an unsigned 32-bit integer representation for absolute time. Absolute time is measured in seconds from midnight, 1st January 2000.

I'm still not sure of the easiest way to convert between the two.

asked Mar 03 '16 by Tom Sitter



5 Answers

The time 1 January 1970 00:00:00 UTC is considered the UNIX epoch. So, if you want to convert from UNIX time to a timestamp with an epoch of January 1, 2000 (call it the "2000 epoch"), the simplest way is to subtract the UNIX time of January 1, 2000 from the UNIX time.

<2000 time> = <UNIX time> - <January 1, 2000 UNIX time>

<UNIX time> = <2000 time> + <January 1, 2000 UNIX time>

Where January 1, 2000 UNIX time is 946684800.

EDIT: The docs do say

Absolute time is measured in seconds from midnight, 1st January 2000.

So, 946684800 is the exact offset that should be used for the conversion. The few seconds of difference that you calculated could be attributed to network delay or other delays.
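As a minimal Python sketch of that conversion (the constant is the Unix timestamp of midnight, 1 January 2000 UTC; the function names are just illustrative):

EPOCH_2000_OFFSET = 946684800  # Unix timestamp of 2000-01-01 00:00:00 UTC

def unix_to_2000(unix_ts):
    # seconds since midnight, 1 January 2000 UTC
    return unix_ts - EPOCH_2000_OFFSET

def to_unix(ts_2000):
    # back to a regular Unix timestamp
    return ts_2000 + EPOCH_2000_OFFSET

print(unix_to_2000(1456979510))  # 510294710, within a few seconds of the API's 510294713
print(to_unix(510294713))        # 1456979513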

answered by JRodDynamite


EDIT: The OP added details specifying that the time starts at midnight, so it is an absolute time different from J2000, which starts at noon. But since the title states "timestamp starting on January 1, 2000", I am leaving this answer up for future answer seekers.

ANSWER:

The timestamp you have mentioned appears to be the J2000.0 mentioned here

Since the Unix and J2000 epoch times are constants, you could define a constant to store the difference.

If you have a mathematical inclination, the following links provide some info regarding the conversion (a rough Python sketch follows the list):

  1. http://www.giss.nasa.gov/tools/mars24/help/algorithm.html Refer to step A-2
  2. http://onegeek.org/software/smeg/current/src/time.c (C file);

    the #define section in that C file contains the following:

    #define J2000 2451545.0 /* you-know-when */

    #define U1970 -10957.5 /* unix epoch relative to J2000 */
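For instance, a rough Python sketch built from those constants (treating J2000 as 2000-01-01 12:00:00 UTC and ignoring the roughly 64-second offset between Terrestrial Time and UTC; the names here are just illustrative):

SECONDS_PER_DAY = 86400
U1970_DAYS = -10957.5                              # Unix epoch relative to J2000, in days (from the C file above)
J2000_UNIX_OFFSET = -U1970_DAYS * SECONDS_PER_DAY  # 946728000.0 seconds

def unix_to_j2000(unix_ts):
    # seconds since the J2000 epoch (approximated as 2000-01-01 12:00:00 UTC)
    return unix_ts - J2000_UNIX_OFFSET

def j2000_to_unix(j2000_ts):
    return j2000_ts + J2000_UNIX_OFFSET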

answered by raghav710


Well, there are 946684800 seconds between 2000-01-01T00:00:00Z and 1970-01-01T00:00:00Z. So, you can just set a constant for 946684800 and add or subtract from your Unix timestamps.

The variation you are seeing in your numbers has to do with the delay in sending and receiving the data, and could also be due to clock synchronization, or lack thereof. Since these are whole seconds, and your numbers are 3 to 4 seconds off, I would guess that the clocks on your computer and your device are 3 to 4 seconds out of sync.
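You can check that constant with Python's datetime module, for example:

from datetime import datetime, timezone

unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
epoch_2000 = datetime(2000, 1, 1, tzinfo=timezone.utc)
print((epoch_2000 - unix_epoch).total_seconds())  # 946684800.0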

answered by Matt Johnson-Pint


The difference between the two times is indeed about 30 years:

>>> import datetime
>>> d1 = datetime.datetime.fromtimestamp(1456979510)
>>> d1.ctime()
'Wed Mar  2 20:31:50 2016'
>>> d2 = datetime.datetime.fromtimestamp(510294713)
>>> d2.ctime()
'Mon Mar  3 20:31:53 1986'

Creating a variable to hold the difference lets you convert in either direction:

>>> conv_factor = (d1 - d2).total_seconds()
>>> conv_factor
946684797.0
>>> conv_time = d2 + datetime.timedelta(seconds=conv_factor)
>>> conv_time
datetime.datetime(2016, 3, 2, 20, 31, 50)
>>> conv_time.ctime()
'Wed Mar  2 20:31:50 2016'

Subtracting conv_factor converts in the other direction.
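For example, continuing the session above, subtracting conv_factor as a timedelta from d1 recovers d2:

>>> d1 - datetime.timedelta(seconds=conv_factor)
datetime.datetime(1986, 3, 3, 20, 31, 53)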

answered by ChrisFreeman


Unix epoch: 1970-01-01, 00:00:00 UTC

J2000 epoch: 2000-01-01, 12:00:00 UTC (the Zigbee absolute time in the question counts from midnight, 2000-01-01, 00:00:00 UTC)

The gap from the Unix epoch is 946684800 seconds to midnight 2000-01-01 UTC, and 946728000 seconds to the J2000 noon epoch.

The pitfall is that Python will use the local timezone when converting from a datetime to a timestamp, as shown below:

import datetime
import time, calendar
dt = datetime.datetime(1970, 1, 1, 0, 0)  # naive datetime.datetime (no time zone attached)
tt = dt.timetuple()
print(tt)  # time.struct_time
print("Unix Timestamp: ", time.mktime(tt))      # interprets the struct_time as local time
print("Unix Timestamp: ", calendar.timegm(tt))  # interprets the struct_time as UTC

So if you want to convert a timestamp back to a datetime, use this code:

t = 0  # timestamp in seconds since midnight, 2000-01-01 UTC
t2000 = t + 946684800  # add the offset to the Unix epoch (use 946728000 instead for the J2000 noon epoch)
#value = datetime.datetime.fromtimestamp(t2000)   # interprets the timestamp in local time
dt = datetime.datetime.utcfromtimestamp(t2000)    # interprets the timestamp as UTC
print(dt.timetuple())
print(dt.strftime('%Y-%m-%d %H:%M:%S'))

There is a nice conversion tool online: https://www.unixtimestamp.com/index.php

answered by Yuchao Jiang