I would like to convert a date/time string to seconds since the epoch. Notably, I would not like it to matter where the machine doing the conversion is; the time zone information in the string should be enough.
I have this test program, pt.cc:
#ifndef _XOPEN_SOURCE
#define _XOPEN_SOURCE // For strptime(); must be defined before any includes.
#endif
#include <assert.h>
#include <errno.h>
#include <iostream>
#include <stdio.h>
#include <string>
#include <string.h>
#include <time.h>
using namespace std; // To be brief, don't do this in real life.
int main(int argc, char* argv[]) {
  (void)argc; (void)argv; // Skip compile warning.
  // I expect both of these to transform to 1440671500.
  cout << "1440671500 expected" << endl;

  const char utc_example[] = "2015-08-27T11:31:40+0100";
  struct tm tm;
  memset(&tm, 0, sizeof(struct tm));
  char* end = strptime(utc_example, "%Y-%m-%dT%H:%M:%S%z", &tm);
  assert(end);
  assert(*end == '\0');
  time_t seconds_since_epoch = mktime(&tm);
  cout << "utc example: " << seconds_since_epoch << " or maybe "
       << seconds_since_epoch - tm.tm_gmtoff + (tm.tm_isdst ? 3600 : 0) << endl;

  const char tz_example[] = "2015-08-27T10:31:40Z";
  memset(&tm, 0, sizeof(struct tm));
  end = strptime(tz_example, "%Y-%m-%dT%H:%M:%S%nZ", &tm);
  assert(end);
  assert(*end == '\0');
  seconds_since_epoch = mktime(&tm);
  cout << " tz example: " << seconds_since_epoch << " or maybe "
       << seconds_since_epoch - tm.tm_gmtoff + (tm.tm_isdst ? 3600 : 0) << endl;
  return 0;
}
This is the output:
jeff@birdsong:tmp $ clang++ -ggdb3 -Wall -Wextra -std=c++14 pt.cc -o pt
jeff@birdsong:tmp $ ./pt
1440671500 expected
utc example: 1440671500 or maybe 1440667900
tz example: 1440667900 or maybe 1440664300
jeff@birdsong:tmp $ TZ=America/New_York ./pt
1440671500 expected
utc example: 1440693100 or maybe 1440711100
tz example: 1440689500 or maybe 1440707500
jeff@birdsong:tmp $ TZ=Europe/London ./pt
1440671500 expected
utc example: 1440675100 or maybe 1440675100
tz example: 1440671500 or maybe 1440671500
jeff@birdsong:tmp $
Note how the return value of mktime() changes depending on the ambient time zone. The man page for mktime() says it interprets the broken-down time as local time. So I tried subtracting the GMT offset and compensating for DST in case mktime() was ignoring those fields (the "or maybe" values).
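One Linux-only workaround that gets suggested for this kind of thing is a minimal sketch along the following lines; it assumes glibc, whose strptime() fills tm_gmtoff for %z (as the runs below suggest) and which provides the non-standard timegm() that interprets the broken-down fields as UTC:
#include <assert.h>
#include <string.h>
#include <time.h>
#include <iostream>

int main() {
  struct tm tm;
  memset(&tm, 0, sizeof(struct tm));
  // strptime() stores the parsed "+0100" in tm_gmtoff (glibc behaviour).
  char* end = strptime("2015-08-27T11:31:40+0100", "%Y-%m-%dT%H:%M:%S%z", &tm);
  assert(end && *end == '\0');
  // timegm() ignores tm_gmtoff and treats the fields as UTC wall-clock time,
  // so the parsed offset has to be removed by hand.
  time_t seconds_since_epoch = timegm(&tm) - tm.tm_gmtoff;
  std::cout << seconds_since_epoch << std::endl;  // 1440671500 in any ambient TZ
  return 0;
}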
Any tips on how to do this correctly? (Should it matter, I only need this to work on Linux.)
This answer uses this date/time library:
http://howardhinnant.github.io/date/date.html
The approach taken here is to completely circumvent the C date/time API. Personally I find the C approach a bit confusing, cumbersome, and somewhat dangerous.
That being said, the parsing and formatting facilities in my date/time library are non-existent. I am foreseeing that such facilities may become a separate library, layered on top of my library in the future.
In the meantime, it is not difficult to roll your own parsing for this particular problem. Here is how:
#include "chrono_io.h"
#include "date.h"
#include <iostream>
#include <string>
#include <sstream>
using second_point = std::chrono::time_point<std::chrono::system_clock,
std::chrono::seconds>;
std::chrono::minutes
parse_offset(std::istream& in)
{
using namespace std::chrono;
char c;
in >> c;
minutes result = 10*hours{c - '0'};
in >> c;
result += hours{c - '0'};
in >> c;
result += 10*minutes{c - '0'};
in >> c;
result += minutes{c - '0'};
return result;
}
second_point
parse(const std::string& str)
{
std::istringstream in(str);
in.exceptions(std::ios::failbit | std::ios::badbit);
int yi, mi, di;
char dash;
// check dash if you're picky
in >> yi >> dash >> mi >> dash >> di;
using namespace date;
auto ymd = year{yi}/mi/di;
// check ymd.ok() if you're picky
char T;
in >> T;
// check T if you're picky
int hi, si;
char colon;
in >> hi >> colon >> mi >> colon >> si;
// check colon if you're picky
using namespace std::chrono;
auto h = hours{hi};
auto m = minutes{mi};
auto s = seconds{si};
second_point result = sys_days{ymd} + h + m + s;
char f;
in >> f;
if (f == '+')
result -= parse_offset(in);
else if (f == '-')
result += parse_offset(in);
else
;// check f == 'Z' if you're picky
return result;
}
int
main()
{
using namespace date;
std::cout << parse("2015-08-27T11:31:40+0100").time_since_epoch() << '\n';
std::cout << parse("2015-08-27T10:31:40Z").time_since_epoch() << '\n';
}
To be completely upfront, this solution is making major use of std::istringstream and std::chrono, and actually just a very small part of it is my date library.
There are several design choices I've made, which you may choose not to (there are so many options when parsing). For example, I chose to throw an exception if there are any parsing errors, and I chose to not be picky about checking delimiters such as - and : (mainly for brevity reasons).
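If you do want to be picky, a small helper along these lines would do (a hypothetical sketch; expect() is not part of "date.h"):
#include <istream>
#include <stdexcept>
#include <string>

// Hypothetical helper for the "picky" checks: read one character and throw
// if it is not the delimiter we expected.
inline void expect(std::istream& in, char expected)
{
    char c;
    in >> c;
    if (c != expected)
        throw std::runtime_error(std::string("expected '") + expected + "'");
}

// Usage inside parse():  in >> yi;  expect(in, '-');  in >> mi;  expect(in, '-');  in >> di;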
The code is relatively self-explanatory. And as you state in your question, local time zones are not (and should not be) part of the solution. The chrono library is used to manage arithmetic among hours, minutes and seconds. And my date library is used to handle the conversion of a year/month/day into a chrono::time_point that has a precision of days.
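For instance, the core arithmetic on its own looks like this (a minimal sketch using the same "date.h"):
#include "date.h"
#include <chrono>
#include <iostream>

int main()
{
    using namespace date;
    using namespace std::chrono;
    // 2015-08-27 10:31:40 UTC, built directly from the calendar fields.
    auto tp = sys_days{year{2015}/8/27} + hours{10} + minutes{31} + seconds{40};
    std::cout << tp.time_since_epoch().count() << '\n';  // prints 1440671500
}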
With all this arithmetic handled for you, you can concentrate just on the parsing of integers and characters. It is straightforward to add more checking to this example and to do whatever you want for errors.
This example outputs:
1440671500s
1440671500s
Update
I have since added parsing abilities to "date.h", and the above parse function can now be more simply written:
date::sys_seconds
parse(const std::string& str)
{
    std::istringstream in(str);
    date::sys_seconds tp;
    in >> date::parse("%FT%TZ", tp);
    if (in.fail())
    {
        in.clear();
        in.str(str);
        in >> date::parse("%FT%T%z", tp);
    }
    return tp;
}

int
main()
{
    using namespace date;
    std::cout << parse("2015-08-27T11:31:40+0100").time_since_epoch() << '\n';
    std::cout << parse("2015-08-27T10:31:40Z").time_since_epoch() << '\n';
}
And it returns the identical results (1440671500s for both examples).
Here's an answer that does what you want using Google's cctz library: https://github.com/google/cctz
#include <chrono>
#include <iostream>
#include <string>
#include "src/cctz.h"

using namespace std;

int main(int argc, char* argv[]) {
  const char kFmt[] = "%Y-%m-%dT%H:%M:%S%Ez";

  // I expect both of these to transform to 1440671500.
  const char utc_example[] = "2015-08-27T11:31:40+0100";
  const char tz_example[] = "2015-08-27T10:31:40Z";

  cout << "1440671500 expected" << endl;

  // Required by cctz::Parse(). Only used if the formatted
  // time does not include offset info.
  const auto utc = cctz::UTCTimeZone();

  std::chrono::system_clock::time_point tp;
  if (!Parse(kFmt, utc_example, utc, &tp)) return -1;
  cout << "utc example: " << std::chrono::system_clock::to_time_t(tp) << "\n";
  if (!Parse(kFmt, tz_example, utc, &tp)) return -1;
  cout << " tz example: " << std::chrono::system_clock::to_time_t(tp) << "\n";
  return 0;
}
The output is:
1440671500 expected
utc example: 1440671500
tz example: 1440671500
Note that other answers that involve adding/subtracting offsets from, say, a time_t are using a technique called "epoch shifting", and it doesn't actually work. I explain why at 12:30 in this talk from CppCon: https://youtu.be/2rnIHsqABfM?t=12m30s
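As a rough illustration of part of the problem (a sketch, not taken from the talk): the UTC offset is a property of the instant, not of the zone alone, so no single shift value is right for a whole zone:
#include <time.h>
#include <iostream>

// Report the system's actual UTC offset at a given instant.
static long offset_at(time_t t) {
  struct tm tm;
  localtime_r(&t, &tm);
  return tm.tm_gmtoff;  // glibc extension, the same field used in the question
}

int main() {
  tzset();
  // With TZ=America/New_York these print -14400 (EDT) and -18000 (EST).
  std::cout << offset_at(1440671500) << '\n';  // 2015-08-27 10:31:40 UTC
  std::cout << offset_at(1450915200) << '\n';  // 2015-12-24 00:00:00 UTC
  return 0;
}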