
Why is 1/1/1970 the "epoch time"?

Tags: unix, time, epoch

People also ask

Why is January 1 1970 the epoch?

January 1st, 1970 at 00:00:00 UTC is referred to as the Unix epoch. Early Unix engineers picked that date arbitrarily because they needed to set a uniform date for the start of time, and New Year's Day, 1970, seemed most convenient.

Why do computers count time from 1970?

It all began on 1 January 1970. Computers don't count days, months, or years; instead, they work with the number of seconds elapsed since midnight proleptic Coordinated Universal Time (UTC) of January 1, 1970, not counting leap seconds. Why that date? Because it was convenient for early Unix engineers to work with.

What does Epoch time represent?

In a computing context, an epoch is the date and time relative to which a computer's clock and timestamp values are determined. The epoch traditionally corresponds to 0 hours, 0 minutes, and 0 seconds (00:00:00) Coordinated Universal Time (UTC) on a specific date, which varies from system to system.

When did epoch start?

The Unix epoch is 00:00:00 UTC on 1 January 1970.


Early versions of Unix measured system time in 1/60 s intervals. This meant that a 32-bit unsigned integer could only represent a span of time less than 829 days. For this reason, the time represented by the number 0 (called the epoch) had to be set in the very recent past. As this was in the early 1970s, the epoch was set to 1971-1-1.

Later, the system time was changed to increment every second, which increased the span of time that could be represented by a 32-bit unsigned integer to around 136 years. As it was no longer so important to squeeze every second out of the counter, the epoch was rounded down to the nearest decade, thus becoming 1970-1-1. One must assume that this was considered a bit neater than 1971-1-1.

Note that a 32-bit signed integer using 1970-1-1 as its epoch can represent dates up to 2038-1-19, on which date it will wrap around to 1901-12-13.
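As a quick sanity check (purely illustrative, using Python's standard library), the figures above can be reproduced like this:

    from datetime import datetime, timedelta, timezone

    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

    # Span of an unsigned 32-bit counter ticking 60 times per second (the original design).
    print(timedelta(seconds=2**32 / 60))         # 828 days, 12:06:28.266667 -- just under 829 days

    # Limits of a signed 32-bit counter of whole seconds since 1970-1-1.
    print(epoch + timedelta(seconds=2**31 - 1))  # 2038-01-19 03:14:07+00:00
    print(epoch + timedelta(seconds=-(2**31)))   # 1901-12-13 20:45:52+00:00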


History

The earliest versions of Unix time had a 32-bit integer incrementing at a rate of 60 Hz, which was the rate of the system clock on the hardware of the early Unix systems. The value 60 Hz still appears in some software interfaces as a result. The epoch also differed from the current value. The first edition Unix Programmer's Manual dated November 3, 1971 defines the Unix time as "the time since 00:00:00, Jan. 1, 1971, measured in sixtieths of a second".


http://en.wikipedia.org/wiki/Unix_time#History explains a little about the origins of Unix time and the chosen epoch. The definition of Unix time and the epoch date went through a couple of changes before stabilizing on what it is now.

But it does not say why exactly 1/1/1970 was chosen in the end.

Notable excerpts from the Wikipedia page:

The first edition Unix Programmer's Manual dated November 3, 1971 defines the Unix time as "the time since 00:00:00, Jan. 1, 1971, measured in sixtieths of a second".

Because of [the] limited range, the epoch was redefined more than once, before the rate was changed to 1 Hz and the epoch was set to its present value.

Several later problems, including the complexity of the present definition, result from Unix time having been defined gradually by usage rather than fully defined to start with.


Epoch reference date

An epoch reference date is a point on the timeline from which we count time. Moments before that point are counted with a negative number, and moments after with a positive number.
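For example, a moment one day after the Unix epoch is counted as +86400 seconds, and a moment one day before it as -86400 (shown here with Python, purely as an illustration):

    from datetime import datetime, timezone

    # Seconds relative to the 1970-01-01 epoch: positive after, negative before.
    print(datetime(1970, 1, 2, tzinfo=timezone.utc).timestamp())    #  86400.0
    print(datetime(1969, 12, 31, tzinfo=timezone.utc).timestamp())  # -86400.0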

Many epochs in use

Why is 1 January 1970 00:00:00 considered the epoch time?

No, not the epoch, an epoch. There are many epochs in use.

This choice of epoch is arbitrary.

Major computer systems and libraries use any of at least a couple dozen different epochs. One of the most popular epochs is commonly known as Unix Time, using the 1970 UTC moment you mentioned.

While popular, Unix Time’s 1970 may not be the most common. Also in the running for most common would be January 0, 1900, used by countless Microsoft Excel & Lotus 1-2-3 spreadsheets, or January 1, 2001, used by Apple’s Cocoa framework on over a billion iOS/macOS machines worldwide in countless apps. Or perhaps January 6, 1980, used by GPS devices?
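Converting a count from one epoch to another is just a matter of adding the offset between the two reference dates. A minimal Python sketch, assuming the commonly documented reference dates (the helper function is hypothetical):

    from datetime import datetime, timezone

    UNIX_EPOCH  = datetime(1970, 1, 1, tzinfo=timezone.utc)
    COCOA_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)  # Apple Cocoa reference date
    GPS_EPOCH   = datetime(1980, 1, 6, tzinfo=timezone.utc)  # GPS week zero (leap seconds ignored here)

    def cocoa_to_unix(cocoa_seconds: float) -> float:
        # Shift the count by the number of seconds between the two reference dates.
        return (COCOA_EPOCH - UNIX_EPOCH).total_seconds() + cocoa_seconds

    print(cocoa_to_unix(0.0))  # 978307200.0 -> 2001-01-01T00:00:00Z expressed as Unix Time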

Many granularities

Different systems use different granularities when counting time.

Even the so-called “Unix Time” varies, with some systems counting whole seconds and some counting milliseconds. Many databases, such as Postgres, use microseconds. Some, such as the modern java.time framework in Java 8 and later, use nanoseconds. Some use still other granularities.
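To make the difference concrete, here is the same moment expressed at several granularities (a Python sketch, illustrative only):

    import time

    ns = time.time_ns()       # nanoseconds since the 1970 epoch, as in java.time
    us = ns // 1_000          # microseconds, e.g. Postgres timestamps
    ms = ns // 1_000_000      # milliseconds, common in JavaScript and older Java APIs
    s  = ns // 1_000_000_000  # classic Unix Time: whole seconds

    print(s, ms, us, ns)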

ISO 8601

Because there is so much variance in the choice of epoch reference and in the granularity used, it is generally best to avoid communicating moments as a count-from-epoch. Between the ambiguity of epoch & granularity, and the difficulty humans have in reading meaning into a raw count (and therefore in spotting buggy values), use plain text instead of numbers.

The ISO 8601 standard provides an extensive set of practical well-designed formats for expressing date-time values as text. These formats are easy to parse by machine as well as easy to read by humans across cultures.

These include:

  • Date-only: 2019-01-23
  • Moment in UTC: 2019-01-23T12:34:56.123456Z
  • Moment with offset-from-UTC: 2019-01-23T18:04:56.123456+05:30
  • Week of week-based-year: 2019-W23
  • Ordinal date (1st to 366th day of year): 2019-234
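In Python, for example, the standard library can emit and parse such ISO 8601 text directly (a minimal sketch):

    from datetime import datetime, timezone

    moment = datetime(2019, 1, 23, 12, 34, 56, 123456, tzinfo=timezone.utc)

    # Exchange the moment as ISO 8601 text rather than as a raw count-from-epoch.
    text = moment.isoformat()              # '2019-01-23T12:34:56.123456+00:00'
    parsed = datetime.fromisoformat(text)  # read it back without ambiguity
    assert parsed == moment
    print(text)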