How do I get a microsecond timestamp in C?
I'm trying to do:
struct timeval tv;
gettimeofday(&tv, NULL);
return tv.tv_usec;
But this returns a nonsense value: if I take two timestamps, the second one can be smaller or bigger than the first (the second one should always be bigger). Is it possible to convert the value that gettimeofday fills in into a normal number which can actually be worked with?
A microsecond is a unit of time in the International System of Units (SI) equal to one millionth (0.000001, 10⁻⁶, or 1/1,000,000) of a second. Its symbol is μs, sometimes simplified to us when Unicode is not available.
The struct timeval structure represents an elapsed time. It is declared in `sys/time.h' and has the following members: long int tv_sec, the number of whole seconds of elapsed time, and long int tv_usec, the rest of the elapsed time (a fraction of a second) represented as a number of microseconds, which is always less than one million.
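As a quick illustration of those two members, here is a minimal sketch that fills a struct timeval and prints both fields (the program itself is just an example, not code from the question):

#include <stdio.h>
#include <sys/time.h>

int main(void) {
    struct timeval tv;
    gettimeofday(&tv, NULL);                           /* fill tv with the current wall-clock time */
    printf("seconds: %ld\n", (long)tv.tv_sec);         /* whole seconds since the Unix epoch */
    printf("microseconds: %ld\n", (long)tv.tv_usec);   /* fractional part, 0..999999 */
    return 0;
}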
You need to add in the seconds, too:
unsigned long time_in_micros = 1000000 * tv.tv_sec + tv.tv_usec;
Note that an unsigned long is only 32 bits wide on a typical 32-bit system, so this value will wrap around after about 2^32 / 10^6 ≈ 4295 seconds, or roughly 71 minutes.
You have two choices for getting a microsecond timestamp. The first (and best) choice is to use the timeval type directly:
struct timeval GetTimeStamp() {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv;
}
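To compare two such timestamps you subtract the two fields separately; the following is just one possible sketch (the names start and end, and doing the arithmetic in long long, are choices made here for illustration):

#include <stdio.h>
#include <sys/time.h>

int main(void) {
    struct timeval start, end;
    gettimeofday(&start, NULL);
    /* ... the work being timed goes here ... */
    gettimeofday(&end, NULL);

    /* combine the two fields; end.tv_usec may be smaller than start.tv_usec,
       so the whole expression is evaluated in a signed 64-bit type */
    long long elapsed_us = (long long)(end.tv_sec - start.tv_sec) * 1000000LL
                         + (end.tv_usec - start.tv_usec);
    printf("elapsed: %lld microseconds\n", elapsed_us);
    return 0;
}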
The second, and for me less desirable, choice is to build a uint64_t out of a timeval:
uint64_t GetTimeStamp() {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec * (uint64_t)1000000 + tv.tv_usec;
}
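A minimal usage sketch of that second variant: subtracting two such 64-bit stamps gives the elapsed time in microseconds directly. The usleep call here is only a stand-in for whatever work you are actually timing.

#include <stdint.h>
#include <stdio.h>
#include <sys/time.h>
#include <unistd.h>

uint64_t GetTimeStamp() {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec * (uint64_t)1000000 + tv.tv_usec;
}

int main(void) {
    uint64_t start = GetTimeStamp();
    usleep(250000);                       /* stand-in for the work being timed */
    uint64_t end = GetTimeStamp();
    printf("elapsed: %llu us\n", (unsigned long long)(end - start));
    return 0;
}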