I am looking at some C++ code on Windows.
It states that 1 tick is equal to 100 nanoseconds. Is this specific to Windows, or is it some generic standard? If it is a standard, what is its name, and is it the same on other operating systems?
I ask because I have to write platform-independent code; if this is Windows-specific, I will have to wrap this part of the code in #ifdef WIN32.
This is Microsoft-specific. The .NET TimeSpan documentation states:
The smallest unit of time is the tick, which is equal to 100 nanoseconds. A tick can be negative or positive.
On Linux you can use high-resolution timers to get 100-nanosecond accuracy, but you will need to handle them separately.
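If all you need is to express durations in 100-nanosecond ticks without an #ifdef, a minimal sketch using std::chrono is shown below. The alias name ticks is my own choice, not a standard identifier; it simply fixes the tick period as 1/10,000,000 of a second, which works the same on Windows, Linux, and macOS.

#include <chrono>
#include <cstdint>
#include <iostream>

// One tick = 100 ns, i.e. 1/10,000,000 of a second.
using ticks = std::chrono::duration<std::int64_t, std::ratio<1, 10000000>>;

int main() {
    // steady_clock is monotonic on every major platform; its actual resolution
    // varies, but any measured interval can still be expressed in 100-ns ticks.
    auto start = std::chrono::steady_clock::now();
    // ... work being timed ...
    auto elapsed = std::chrono::duration_cast<ticks>(
        std::chrono::steady_clock::now() - start);

    std::cout << "elapsed: " << elapsed.count() << " ticks ("
              << elapsed.count() * 100 << " ns)\n";
}

Note that this only standardizes the unit; the actual precision you get still depends on the clock the OS provides, so measuring at true 100-nanosecond granularity may still require platform-specific timers.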