Given a timestamp string sent from machine A to machine B (for example, in the format hh:mm:ss.fff), and assuming both machines' clocks are synchronised, how can machine B calculate the timespan between its own clock and the time in the string it received from machine A?
I've tried comparing with DateTime.Now.Ticks, but the resolution seems to be 10-20 ms. I would like to get closer to 1 ms resolution.
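Roughly, the naive approach looks like this (just a sketch; the format string and sample value are illustrative):

using System;
using System.Globalization;

class NaiveOffset
{
    static void Main()
    {
        // Timestamp string received from machine A (format assumed to be hh:mm:ss.fff).
        string fromA = "14:03:27.123";

        // Parse it as a time of day.
        TimeSpan remote = TimeSpan.ParseExact(fromA, @"hh\:mm\:ss\.fff", CultureInfo.InvariantCulture);

        // Compare against machine B's current time of day.
        TimeSpan local = DateTime.Now.TimeOfDay;
        TimeSpan delta = local - remote;   // limited by DateTime.Now's ~10-20 ms resolution

        Console.WriteLine($"Offset: {delta.TotalMilliseconds:F3} ms");
    }
}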
For some time now I've had success using Stopwatch.GetTimestamp() for high-resolution timing, but that doesn't help directly here: machine B only has the string, and I have no good way to calibrate Stopwatch.GetTimestamp against the actual system time.
DateTime.Now uses the underlying OS clock, which has a resolution of about 15 ms (and even worse on pre-Windows-XP systems). The best you can get is the high-resolution timer exposed by Stopwatch and its ticks.
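If a one-time calibration against the system clock is good enough for your purposes, one common sketch is to anchor a Stopwatch reading to a single DateTime reading and derive sub-millisecond timestamps from the elapsed Stopwatch ticks (note that the anchor itself still carries the ~15 ms error of the system clock):

using System;
using System.Diagnostics;

static class HiResClock
{
    // Anchor: one coarse system-time reading paired with one high-resolution reading.
    private static readonly DateTime baseTime = DateTime.UtcNow;
    private static readonly long baseTicks = Stopwatch.GetTimestamp();

    // Current time = anchor + high-resolution elapsed time since the anchor.
    public static DateTime UtcNow
    {
        get
        {
            long elapsed = Stopwatch.GetTimestamp() - baseTicks;
            double seconds = (double)elapsed / Stopwatch.Frequency;
            return baseTime.AddSeconds(seconds);
        }
    }
}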
You will first have to synchronize the clocks, but that DOES NOT mean "set the system's date". Synchronizing the clocks means calculating the difference between the two readings, optionally checking that it stays constant as time flows (that is, that neither clock runs faster than the other), and then using the calculated offset to shift the readings so that the "actual values" are expressed on the same time base -- whichever base you choose.
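As a concrete illustration of "calculate the offset, then apply it", a round-trip exchange in the style of Cristian's algorithm could look roughly like this (the message exchange between the machines is assumed and not shown):

using System;

static class ClockOffset
{
    // t1: machine B sends a request            (B's clock)
    // t2: machine A receives it and replies    (A's clock, carried in the reply)
    // t3: machine B receives the reply         (B's clock)
    // Assuming the network delay is symmetric, A's clock leads B's by roughly:
    //     offset = t2 - (t1 + t3) / 2
    public static TimeSpan Estimate(DateTime t1, DateTime t2, DateTime t3)
    {
        TimeSpan roundTrip = t3 - t1;
        DateTime midpoint = t1 + TimeSpan.FromTicks(roundTrip.Ticks / 2);
        return t2 - midpoint;
    }

    // Once the offset is known, B can translate a timestamp from A onto its own time base.
    public static DateTime ToLocalTimeBase(DateTime timestampFromA, TimeSpan offset)
        => timestampFromA - offset;
}

With a symmetric network delay this estimates A's clock relative to B's; repeating the exchange and averaging, or keeping the sample with the smallest round trip, reduces the error.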
Google for "clock synchronization algorithms" to read a lot more about it.