I am not sure whether this question belongs on Stack Overflow, but here it is.
I need to generate a timestamp in C# for data that is transferred from one party to another, and I need to know the worst-case precision of the system clock across operating systems (Windows, Linux, and Unix). In other words, I need to figure out a precision such that every operating system is able to validate the timestamp.
As an example, the clock resolution on Windows Vista is approximately 10-15 milliseconds.
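For what it's worth, the effective resolution can also be probed empirically rather than taken from documentation. Here is a rough sketch of how that might look (illustrative only; results will vary with OS, hardware, and timer configuration):

    using System;

    class ClockResolutionProbe
    {
        static void Main()
        {
            // DateTime ticks are 100-nanosecond units, but the system clock
            // usually advances in much coarser jumps than that.
            long last = DateTime.UtcNow.Ticks;
            long minDelta = long.MaxValue;
            int changes = 0;

            while (changes < 100)
            {
                long now = DateTime.UtcNow.Ticks;
                if (now != last)
                {
                    minDelta = Math.Min(minDelta, now - last);
                    last = now;
                    changes++;
                }
            }

            // 10,000 ticks per millisecond.
            Console.WriteLine($"Smallest observed increment: {minDelta / 10000.0} ms");
        }
    }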
Are you looking to generate something like a Unix timestamp for the data, or to find a timestamp that won't collide with an existing file? If it's the latter, you could always use ticks.
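For example, a quick sketch of both options (note that DateTimeOffset.ToUnixTimeSeconds is only available from .NET Framework 4.6 / .NET Core onward):

    using System;

    class TimestampExamples
    {
        static void Main()
        {
            // DateTime ticks: 100-nanosecond intervals since 0001-01-01.
            long ticks = DateTime.UtcNow.Ticks;
            Console.WriteLine(ticks);

            // A classic Unix timestamp: seconds since 1970-01-01 UTC.
            long unixSeconds = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
            Console.WriteLine(unixSeconds);
        }
    }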
The problem with any "long" timestamp is that it is relative to the machine generating it and won't guarantee non-collision on the other system, since the clocks can simply be set differently (not drift apart, but actually be set to different times).
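If non-collision is the real requirement, one common pattern (just a sketch, not the only approach) is to keep the timestamp for ordering but append a GUID to carry the uniqueness guarantee:

    using System;

    class CollisionSafeStamp
    {
        static void Main()
        {
            // The GUID provides uniqueness across machines, so differently
            // set clocks no longer matter for collisions.
            string stamp = $"{DateTime.UtcNow.Ticks}_{Guid.NewGuid():N}";
            Console.WriteLine(stamp);
        }
    }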
If the data is secure/sensitive and you are looking at a time-based mechanism for syncing keys (à la Kerberos), I would not suggest rolling your own, as there are many obstacles to overcome, especially in getting systems synced and keeping them that way.