I've just been reading this mind-blowing and hilarious post about some common falsehoods regarding time. Number forty is:
Every integer is a theoretical possible year
This implies that not every integer is a theoretically possible year. So what is the negative case here? Which integer is not a theoretically possible year?
Depending on the context, 0 is not a valid year number. In the Gregorian calendar we're currently using (and in its predecessor, the Julian calendar), the year 1 (CE/AD) was immediately preceded by the year -1 (1 BCE/BC). (For dates before the Gregorian calendar was introduced, we can use either the Julian calendar or the proleptic Gregorian calendar.)
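As one concrete illustration of such a context (my example, not one from the original post), Python's standard datetime module has no year 0 at all: its supported range simply starts at year 1.

```python
# A quick sketch: Python's standard datetime module has no year 0.
import datetime

print(datetime.MINYEAR, datetime.MAXYEAR)  # 1 9999 -- the range starts at year 1

try:
    datetime.date(0, 1, 1)  # attempting to construct a date in "year 0"
except ValueError as exc:
    print("year 0 rejected:", exc)
```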
In a programming context, this may or may not be directly relevant. Different languages, libraries, and frameworks represent years in different ways. ISO 8601, for example, supports years from 0000 to 9999, where 0000 is 1 BCE; wider ranges can be supported by mutual agreement. Some implementations of the C standard library can only represent times from about 1901 to 2038; others, using a 64-bit time_t, can represent a much wider range, and typically treat -1, 0, and 1 as consecutive years.
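To make the "consecutive years" point concrete, here is a minimal sketch of that astronomical / ISO 8601-style numbering, in which year 0 is 1 BCE and year -1 is 2 BCE. The helper names are mine, not from any particular library.

```python
# Minimal sketch of astronomical (ISO 8601-style) year numbering,
# where 0 means 1 BCE and -1 means 2 BCE, so -1, 0, 1 are consecutive.
# Helper names are illustrative, not taken from any standard library.

def bce_to_astronomical(bce_year: int) -> int:
    """Map an 'N BCE' label to its astronomical year number (1 BCE -> 0)."""
    if bce_year < 1:
        raise ValueError("BCE years are numbered starting at 1")
    return 1 - bce_year

def astronomical_to_label(year: int) -> str:
    """Render an astronomical year number as a CE/BCE label."""
    return f"{year} CE" if year >= 1 else f"{1 - year} BCE"

for y in (-1, 0, 1):
    print(y, "->", astronomical_to_label(y))
# -1 -> 2 BCE
#  0 -> 1 BCE
#  1 -> 1 CE
```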
Ultimately you'll need to check the documentation for whatever language/library/framework you're using.
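As a rough probe (not a definitive test), you can also check whether your platform's time functions accept timestamps before 1970, which is where the roughly-1901-to-2038 limits of a signed 32-bit time_t come from:

```python
# Hedged probe: does this platform accept timestamps before 1970?
# -2**31 seconds before the epoch is late 1901, the lower bound of a
# signed 32-bit time_t; behavior varies, so catch the errors some
# platforms raise instead of assuming it works.
import time

try:
    print(time.gmtime(-2**31))  # works on many Unix-like systems
except (OverflowError, OSError, ValueError) as exc:
    print("pre-1970 timestamps not supported here:", exc)
```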