I've just seen this question, where one of the answers indicates that System.Diagnostics.Stopwatch should only be used for diagnosing performance and not in production code.
In that case, what would be the best way to get precision timing in .NET? I'm currently in the early stages of building a very simple MIDI sequencer using the MIDI-out functionality of NAudio. I'd like to be able to send MIDI messages out aligned to (say) 1/10s with as little jitter as possible. Is this feasible, or will things like context-switching ruin my day?
I currently have some code in a console app that continuously polls a Stopwatch and calculates the jitter while generating a stream of 1/16th notes at 150 bpm. The jitter is very low in this setup. However, I'll be moving this off to another thread, so I don't know whether that will remain the case.
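For reference, here's a minimal sketch of what such a measurement loop might look like (the actual code isn't shown above); it busy-waits on a Stopwatch at the 100 ms interval implied by 1/16th notes at 150 bpm and records the worst overshoot:

```csharp
using System;
using System.Diagnostics;

class JitterTest
{
    static void Main()
    {
        // 150 bpm -> quarter note = 400 ms -> 1/16th note = 100 ms
        const double intervalMs = 60000.0 / 150 / 4;
        var sw = Stopwatch.StartNew();
        double worstJitterMs = 0;

        for (int i = 1; i <= 64; i++)
        {
            double targetMs = i * intervalMs;

            // Busy-wait until the target time; a real sequencer would sleep
            // for most of the interval and only spin near the end.
            while (sw.Elapsed.TotalMilliseconds < targetMs) { }

            double jitterMs = sw.Elapsed.TotalMilliseconds - targetMs;
            worstJitterMs = Math.Max(worstJitterMs, jitterMs);
            // The MIDI-out call would go here.
        }

        Console.WriteLine($"Worst jitter: {worstJitterMs:F3} ms");
    }
}
```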
First, let's take a look at precision: the DateTime type is basically just a 64-bit integer that counts "ticks". One tick is 100 nanoseconds (0.0001 milliseconds) long (MSDN), so DateTime's precision can be as fine as 0.0001 milliseconds.
From MSDN you'll find that DateTime.Now has an approximate resolution of 10 milliseconds on all NT operating systems. The actual precision is hardware-dependent.
The same goes for DateTime.UtcNow; as Hans Passant explains, DateTime.UtcNow is accurate to 15.625 milliseconds and stable over very long periods thanks to the periodic time-service updates.
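If you want to see what your own machine reports, a small sketch like the following spins until DateTime.UtcNow changes and prints the observed step size (the values you get depend on the OS and timer configuration):

```csharp
using System;

class DateTimeResolution
{
    static void Main()
    {
        // Observe how far apart successive distinct DateTime.UtcNow values are.
        long last = DateTime.UtcNow.Ticks;
        for (int i = 0; i < 10; i++)
        {
            long next;
            while ((next = DateTime.UtcNow.Ticks) == last) { }  // spin until the clock steps
            Console.WriteLine($"Step: {(next - last) / 10000.0:F3} ms");
            last = next;
        }
    }
}
```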
If you don't mind P/Invoke, you can use QueryPerformanceCounter: http://www.eggheadcafe.com/articles/20021111.asp
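Note that on systems with a high-resolution counter, Stopwatch is itself built on QueryPerformanceCounter (see Stopwatch.IsHighResolution), so P/Invoking it directly mainly buys you access to the raw counts. A minimal sketch of the declarations and a timing measurement:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Threading;

class QpcDemo
{
    [DllImport("kernel32.dll")]
    static extern bool QueryPerformanceCounter(out long count);

    [DllImport("kernel32.dll")]
    static extern bool QueryPerformanceFrequency(out long frequency);

    static void Main()
    {
        QueryPerformanceFrequency(out long freq);   // counts per second
        QueryPerformanceCounter(out long start);

        Thread.Sleep(100);                          // something to measure

        QueryPerformanceCounter(out long end);
        Console.WriteLine($"Elapsed: {(end - start) * 1000.0 / freq:F3} ms");
    }
}
```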
You could also use the native streaming MIDI API. I don't think it's in NAudio, but you could look at the C# Midi Toolkit to see whether it supports it. Otherwise you have two examples of how to P/Invoke the native MIDI API and can roll your own...
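As a starting point for rolling your own, here's a minimal P/Invoke sketch using the conventional winmm MIDI-out calls (midiOutOpen/midiOutShortMsg); the streaming API (midiStreamOpen/midiStreamOut) follows the same P/Invoke pattern but additionally lets you queue events with delta times so the driver handles the scheduling. Device ID 0 is an assumption here; it's typically the Microsoft GS Wavetable Synth on Windows.

```csharp
using System;
using System.Runtime.InteropServices;
using System.Threading;

class MidiOutDemo
{
    [DllImport("winmm.dll")]
    static extern uint midiOutOpen(out IntPtr handle, uint deviceId,
                                   IntPtr callback, IntPtr instance, uint flags);

    [DllImport("winmm.dll")]
    static extern uint midiOutShortMsg(IntPtr handle, uint message);

    [DllImport("winmm.dll")]
    static extern uint midiOutClose(IntPtr handle);

    static void Main()
    {
        // Open device 0 (assumed; typically the Microsoft GS Wavetable Synth).
        midiOutOpen(out IntPtr handle, 0, IntPtr.Zero, IntPtr.Zero, 0);

        // Note-on, channel 1, middle C, velocity 127:
        // status | (note << 8) | (velocity << 16)
        midiOutShortMsg(handle, (uint)(0x90 | (60 << 8) | (127 << 16)));
        Thread.Sleep(400);

        // Matching note-off.
        midiOutShortMsg(handle, (uint)(0x80 | (60 << 8)));

        midiOutClose(handle);
    }
}
```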