 

Is it safe to store milliseconds since Epoch in uint32

Tags:

c++

I'm currently rewriting some old code and came across this:

gettimeofday(&tv, NULL);
unsigned int t = tv.tv_sec * 1000 + tv.tv_usec / 1000;

This really looks like they're trying to store the milliseconds since the Epoch in a uint32. I was fairly sure this would not fit, so I did some testing.

#include <sys/time.h>
#include <stdint.h>

int main() {
    struct timeval tv;
    gettimeofday(&tv, nullptr);
    // Same expression, stored in 32 bits and in 64 bits for comparison.
    uint32_t t32 = tv.tv_sec * 1000 + tv.tv_usec / 1000;
    int64_t t64 = tv.tv_sec * 1000 + tv.tv_usec / 1000;
    return 0;
}

And I was kind of right:

(gdb) print t32
$1 = 1730323142
(gdb) print t64
$2 = 1423364498118

So I guess what they're doing is not safe. But what exactly are they doing, why are they doing it, and what actually happens? (In this example roughly the top 10 bits are lost; they only care about the diffs.) Do they still keep millisecond precision? (Yes.) Note that they're sending this "timestamp" over the network and still use it for calculations.
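For illustration (this sketch is not from the original post), here is roughly what the truncation does, assuming a 64-bit time_t as on a typical modern Linux system: the 32-bit variable ends up holding the full millisecond count modulo 2^32, so millisecond granularity is kept and only the high bits are cut off.

#include <sys/time.h>
#include <stdint.h>
#include <stdio.h>

int main() {
    struct timeval tv;
    gettimeofday(&tv, nullptr);

    // Full millisecond count since the Epoch; needs more than 32 bits today.
    int64_t ms64 = (int64_t)tv.tv_sec * 1000 + tv.tv_usec / 1000;

    // Storing it in 32 bits keeps exactly the low 32 bits (value mod 2^32).
    uint32_t ms32 = (uint32_t)ms64;

    printf("full   : %lld ms\n", (long long)ms64);
    printf("stored : %u ms (low 32 bits only)\n", ms32);
    return 0;
}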

asked Feb 08 '15 by noob


2 Answers

No, it is not "safe": it sacrifices portability, accuracy, or both.

It is portable if you only care about the low bits, e.g. if you're sending these times over the network and then diffing them on the other side, with a maximum difference of about 4.3 million seconds (roughly 49.7 days) — see the sketch after this answer.

It is accurate if you only run this code on systems where int is 64 bits. There are some machines like that, but not many.
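As a hedged illustration of the first point (the helper name here is just for demonstration): unsigned 32-bit subtraction is defined modulo 2^32, so the difference of two truncated timestamps is still exact as long as the real gap is under about 49.7 days, even if the counter wrapped in between.

#include <stdint.h>
#include <stdio.h>

// Elapsed milliseconds between two truncated timestamps. Unsigned
// subtraction wraps modulo 2^32, so the result is correct even if the
// 32-bit counter overflowed between the two samples, provided the real
// gap is below 2^32 ms (about 49.7 days).
uint32_t elapsed_ms(uint32_t earlier, uint32_t later) {
    return later - earlier;
}

int main() {
    uint32_t before_wrap = 0xFFFFFF00u;  // shortly before the counter wraps
    uint32_t after_wrap  = 0x00000100u;  // shortly after it has wrapped
    printf("%u ms\n", elapsed_ms(before_wrap, after_wrap));  // prints 512
    return 0;
}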

answered by John Zwinck


Is it safe? I could answer both no and yes. No, because these days we already need almost all the bits of a 32-bit number just to count the seconds since January 1970. Multiplying by 1000 shifts the value left by roughly 10 bits, so the top bits of the count no longer fit and are lost.

I could also say yes. At the end of your question you said that this number is used as the timestamp of a packet on the network. The question then is: how far apart in time can two packets be? 10 years? 10 days? 10 seconds? Losing the top ~10 bits still leaves you a large window (about 49.7 days) in which the difference between two packets can be computed with millisecond precision, which I guess is what you want.
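As a small sketch of the numbers involved (not part of the original answer), the window before the 32-bit millisecond counter wraps works out to roughly 49.7 days:

#include <stdint.h>
#include <stdio.h>

int main() {
    // 2^32 milliseconds is the full range of the truncated timestamp.
    uint64_t wrap_ms = 1ull << 32;                   // 4,294,967,296 ms
    double wrap_days = wrap_ms / 1000.0 / 86400.0;   // ms -> s -> days
    printf("wrap period: %llu ms = %.1f days\n",
           (unsigned long long)wrap_ms, wrap_days);  // ~49.7 days
    return 0;
}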

answered by Amadeus