 

Is UNIX time universal

I did some research on the internet but I'm still confused. Is UNIX time a universal time like GMT/UTC, or does it vary from place to place like local time?

I know UNIX time is counted from 1st Jan, 1970 00:00:00 GMT. When I use the getTime() function in Java (more specifically Date d = new Date(); long currentTime = d.getTime()) I am getting the UNIX time in milliseconds. Now, if person A and person B use the same function while sitting in two different time zones, will they get the same result?

Asked Apr 23 '15 by user4213837

People also ask

Is Unix time the same everywhere?

The UNIX timestamp is the number of seconds (or milliseconds) elapsed since an absolute point in time, midnight of Jan 1 1970 in UTC time. (UTC is Greenwich Mean Time without Daylight Saving Time adjustments.) Regardless of your time zone, the UNIX timestamp represents a moment that is the same everywhere.

Is Unix epoch a UTC?

January 1st, 1970 at 00:00:00 UTC is referred to as the Unix epoch.

What timezone is Unix in?

Notice that UNIX Epoch is UTC so it identifies without errors a specific moment in time. Never ask about the timezone of a UNIX epoch timestamp, it is UTC by definition.

Is Unix time accurate?

UNIX time counts the number of seconds since an "epoch." This is very convenient for programs that work with time intervals: the difference between two UNIX time values is a real-time difference measured in seconds, within the accuracy of the local clock. Thousands of programmers rely on this fact.


2 Answers

Now If person A and person B use the same function who are sitting in two different time zones, will they get the same result?

Yes, they will - assuming their clocks are both "correct" of course.

The java.util.Date class is basically a wrapper around "the time since the Unix epoch, in milliseconds". Given that the Unix epoch was an instant in time (not just "midnight on January 1st 1970"), the number of elapsed milliseconds is the same wherever you are. (Ignoring relativity and any discussion of leap seconds...)

(Side-note: at the Unix epoch, it wasn't midnight in Greenwich. It was 1am, because the UK was observing BST at the time. That's British Standard Time, not British Summer Time - the UK was at UTC+1 from Feb 18th 1968 to October 31st 1971. For more similar trivia, see the Noda Time user guide trivia page.)
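A quick way to convince yourself of this is a minimal sketch like the one below (the zone names are arbitrary, chosen only for illustration): changing the JVM's default time zone has no effect on the millisecond count.

    import java.util.Date;
    import java.util.TimeZone;

    public class EpochIsZoneIndependent {
        public static void main(String[] args) {
            // Pretend person A is in Kolkata and person B is in New York.
            TimeZone.setDefault(TimeZone.getTimeZone("Asia/Kolkata"));
            long millisA = new Date().getTime();

            TimeZone.setDefault(TimeZone.getTimeZone("America/New_York"));
            long millisB = new Date().getTime();

            // Apart from the tiny interval between the two calls, the counts match:
            // both are milliseconds since 1970-01-01T00:00:00Z, regardless of zone.
            System.out.println(millisB - millisA); // prints 0 or a very small number
        }
    }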

Answered Sep 27 '22 by Jon Skeet


The answer by Jon Skeet is correct. I'll add a few thoughts.

Unix time means different things to different people. As that Wikipedia article describes, the basic idea is usually a count of seconds since epoch, with epoch being the first moment of 1970 in the UTC time zone. As the name suggests, this approach to time tracking was used in Unix-like operating systems.

Locality

Does it vary by locality? No. By definition, it represents the UTC time zone. So a moment in Unix time means the same simultaneous moment in Auckland, Paris, and Montréal. The UT in UTC means "Universal Time".

Is Unix time universal in the sense of used everywhere? No, certainly not.

Granularity

First, the granularity. As computer clock chips became more precise, conventional computer systems moved to tracking time by millisecond, microsecond, and even nanosecond. Different software assumes different granularity of time tracking. The java.util.Date/.Calendar classes and Joda-Time library both use millisecond resolution, while the newer java.time package built into Java 8 assumes nanosecond resolution. Some databases such as Postgres typically assume microsecond resolution.

Diagram of time-tracking granularity ranging from whole seconds to nanoseconds.
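As a rough illustration of the difference in granularity (assuming Java 8 or later; the actual precision of Instant.now() depends on the platform clock):

    import java.time.Instant;
    import java.util.Date;

    public class GranularityDemo {
        public static void main(String[] args) {
            Date legacy = new Date();        // millisecond resolution
            Instant modern = Instant.now();  // nanosecond-capable resolution

            System.out.println(legacy.getTime());  // count of milliseconds since the epoch
            System.out.println(modern.getNano());  // nanosecond-of-second field
        }
    }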

To quote the Question…

I am getting the UNIX time in milliseconds

Technically a contradiction in terms, as traditional Unix time or POSIX time is tracked by whole seconds rather than milliseconds.
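If you really do want the classic whole-second count, a minimal sketch is to truncate the millisecond value:

    public class WholeSecondUnixTime {
        public static void main(String[] args) {
            long epochMillis = System.currentTimeMillis(); // Java's native count, in milliseconds
            long unixSeconds = epochMillis / 1000L;        // classic whole-second Unix/POSIX time
            System.out.println(unixSeconds);
        }
    }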

Epoch

Secondly, the epoch. The first moment of 1970 is far from the only epoch used by various computer systems. A couple dozen epochs have been used, some very widely. For example, Microsoft Excel and Lotus 1-2-3 spreadsheets, Cocoa, GPS satellites, Galileo satellites, DOS & FAT file systems, and NTP (Network Time Protocol) each use a different epoch, ranging from the years 1899 to 2001.
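For example, a count coming from a Cocoa API (reference date 2001-01-01 UTC) has to be shifted before it can be treated as Unix time. A minimal sketch, using a made-up input value:

    import java.time.Instant;

    public class CocoaEpochConversion {
        // Seconds from the Unix epoch (1970-01-01T00:00:00Z)
        // to the Cocoa reference date (2001-01-01T00:00:00Z).
        private static final long COCOA_OFFSET_SECONDS = 978_307_200L;

        public static void main(String[] args) {
            long cocoaSeconds = 451_000_000L; // hypothetical value taken from an Apple API
            Instant instant = Instant.ofEpochSecond(COCOA_OFFSET_SECONDS + cocoaSeconds);
            System.out.println(instant); // the same moment on the Unix/UTC timeline
        }
    }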

Avoid Count-From-Epoch

Generally it is best to avoid handling date-time values as a count of milliseconds (or any other granularity) from an epoch. Such values are difficult for humans to read and comprehend, which makes debugging harder and mistakes less obvious. On top of that come the possible mistakes from wrong assumptions about granularity and/or epoch discussed above.

Instead use a decent date-time library. In Java that means either:

  • Joda-Time
  • java.time

Do you track text by collecting groups of 7 or 8 bits? No, you use classes and libraries to do the heavy-lifting of handling character sets, character encoding, and such. Do the same for date-time work.
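For instance, with java.time (Java 8+) you work with named moments and zones rather than raw counts. A small sketch, reusing the cities mentioned above:

    import java.time.Instant;
    import java.time.ZoneId;
    import java.time.ZonedDateTime;

    public class ReadableMoments {
        public static void main(String[] args) {
            Instant now = Instant.now(); // one moment on the UTC timeline

            // The same moment rendered for humans in two different zones.
            ZonedDateTime paris = now.atZone(ZoneId.of("Europe/Paris"));
            ZonedDateTime montreal = now.atZone(ZoneId.of("America/Montreal"));

            System.out.println(now);
            System.out.println(paris);
            System.out.println(montreal);
        }
    }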

Answered Sep 27 '22 by Basil Bourque