I have a program that calculates the latency of an object in a pub-sub model. I use the following function to take timestamps:
#include <stdint.h>
#include <sys/time.h>

uint64_t GetTimeStamp() {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    /* Whole seconds scaled to microseconds, plus the microsecond remainder. */
    return tv.tv_sec * (uint64_t)1000000 + tv.tv_usec;
}
The latency is measured as the difference between the publisher's and the subscriber's timestamps, so I'm unsure about the unit of the result. Is it in seconds or microseconds?
The gettimeofday() function retrieves the current Coordinated Universal Time (UTC) and places it in the timeval structure pointed to by tp. If tzp is not NULL, the time zone information is returned in the time zone structure pointed to by tzp.
long int tv_sec: the number of whole seconds of elapsed time.
long int tv_usec: the rest of the elapsed time (a fraction of a second), represented as the number of microseconds.
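For instance, a quick sketch of inspecting the two fields separately:

#include <stdio.h>
#include <sys/time.h>

int main(void) {
    struct timeval tv;
    gettimeofday(&tv, NULL);  /* seconds + microseconds since the Epoch */
    printf("tv_sec  = %ld s\n",  (long)tv.tv_sec);
    printf("tv_usec = %ld us\n", (long)tv.tv_usec);
    return 0;
}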
The timeval structure has tv_sec, which gives you the whole seconds, and tv_usec, which gives you the remaining fraction of a second in microseconds. Since GetTimeStamp() multiplies tv_sec by 1,000,000 and adds tv_usec, the value it returns is in microseconds, so the difference between the publisher's and subscriber's timestamps is in microseconds as well.
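Here is a minimal, self-contained sketch of how the unit works out end to end. usleep() is just a stand-in for the pub-sub transport delay; a real publisher and subscriber would stamp the message in separate processes, which also assumes their clocks are synchronized:

#include <stdio.h>
#include <stdint.h>
#include <unistd.h>
#include <sys/time.h>

uint64_t GetTimeStamp() {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec * (uint64_t)1000000 + tv.tv_usec;
}

int main(void) {
    uint64_t sent = GetTimeStamp();     /* publisher stamps the message */
    usleep(1500);                       /* stand-in for transport delay */
    uint64_t received = GetTimeStamp(); /* subscriber stamps on arrival */

    /* Both stamps are in microseconds, so the difference is too. */
    printf("latency: %llu us (%.3f ms)\n",
           (unsigned long long)(received - sent),
           (received - sent) / 1000.0);
    return 0;
}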
For more information, see http://www.gnu.org/software/libc/manual/html_node/Elapsed-Time.html