I have a single line of code in my app server which gets me a timestamp value using steady_clock, as shown below:
uint64_t now = duration_cast<milliseconds>(steady_clock::now().time_since_epoch()).count();
Now we have two systems: machineA, which is running Ubuntu 12 (gcc 4.6.3 compiler), and machineB, which is running Ubuntu 14 (gcc 4.8.2 compiler).
Now we compile our app server code using make on another Ubuntu 12 VM (which has the gcc 4.7.3 compiler), copy the generated tar file to machineA, and start our app server. After the start, the above line of code prints out a value like this on machineA:
1439944652967
We also compile the same app server code using make on another Ubuntu 14 VM (which has the gcc 4.8.2 compiler), copy the generated tar file to machineB, and start our app server. After the start, the same line of code prints out a value like this on machineB:
10011360
You see the difference, right? I am confused about why these values differ so much, since all the code and everything else is the same. Does anyone have an explanation for this, and how can I fix it?
If needed, I can try adding some debug code to see what's wrong to figure out this issue.
I'm afraid there's been some confusion over what std::steady_clock is.

time_since_epoch gives the duration since the beginning of the clock, not necessarily the Unix epoch. steady_clock only guarantees to be monotonically increasing: it will always move forward and will never decrease.

There is no guarantee about steady_clock representing anything meaningful. It can be the duration since the beginning of the program's execution, the duration the computer has been turned on, the duration since the most recent Tuesday, or pretty much anything else, as long as it continues to move forward. That is why machineA and machineB print wildly different values: each machine's steady_clock has its own arbitrary starting point.

In other words, steady_clock is not actually all that useful for telling real-world time. It is only useful for measuring the passage of time. Its uses include any situation in which you have a point in time A and a point in time B and are curious about the duration between them: benchmarking, progress estimates, etc.
If you're looking for real-world time, you should look into std::system_clock, a clock that represents the time of the system (i.e. the operating system's wall-clock time). It's great for telling time, but it's pretty useless for measuring differentials, since it is not guaranteed to be monotonic; in practice it almost certainly isn't, given daylight saving time, users adjusting their clocks, and other events that can alter real-world time.