
Firing events at microsecond resolution for midi sequencer

Is there a way to fire events in C# at a resolution of a few microseconds?

I am building a MIDI sequencer, and it requires an event to be fired every MIDI tick, which will then play any note registered at that time.

At 120 beats per minute and at a resolution of 120 ppqn (pulses per beat/quarter note), that event should fire every 4.16666 milliseconds. Modern sequencers have higher resolutions such as 768ppqn which would require that event to be fired every 651 microseconds.
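For reference, the tick period follows from a one-line formula; here is a quick C# check of the numbers above:

```csharp
// Microseconds per MIDI tick = 60,000,000 / (BPM * PPQN)
static double MicrosecondsPerTick(double bpm, int ppqn) => 60_000_000.0 / (bpm * ppqn);

// MicrosecondsPerTick(120, 120) -> 4166.67 (about 4.17 ms)
// MicrosecondsPerTick(120, 768) -> 651.04
```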

The best resolution I have found for short-timed events is 1 millisecond. How can I go beyond that?

This problem must already have been solved by any C# MIDI sequencer or MIDI file player. Maybe I am just not looking at the problem from the right angle.

Thank you for your help.

— Brice, asked Aug 04 '10


3 Answers

It is not possible to have events accurately fired on microsecond intervals in .NET.

In fact, because Windows itself is not a real-time OS, guaranteeing that anything happens with microsecond accuracy in user-mode software is pretty much impossible.

For more information on why this is so difficult see the MSDN magazine article: Implement a Continuously Updating, High-Resolution Time Provider for Windows. While it talks about Windows NT, this still generally applies to later versions of Windows.

The conclusion of this article sums it up well:

If you now think that you can obtain the system time with an almost arbitrary precision here, just a slight warning: don't forget the preemptiveness of a multitasking system such as Windows NT. In the best case, the time stamp you'll get is off by only the time it takes to read the performance counter and transform this reading into an absolute time. In the worst cases, the time elapsed could easily be in the order of tens of milliseconds.

Although this might indicate you went through all of this for nothing, rest assured that you didn't. Even executing the call to the Win32 API GetSystemTimeAsFileTime (or gettimeofday under Unix) is subject to the same conditions, so you are actually doing no worse than that. In a majority of the cases, you will have good results. Just don't perform anything requiring real-time predictability on the basis of time stamps in Windows NT.
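To see what user-mode code can actually observe, here is a small sketch (mine, not from the article) using Stopwatch, which wraps QueryPerformanceCounter. Spin-waiting on it can hit sub-millisecond targets on average, but preemption can still blow any individual deadline, exactly as the article warns:

```csharp
using System;
using System.Diagnostics;

class StopwatchResolutionDemo
{
    static void Main()
    {
        // Stopwatch wraps QueryPerformanceCounter; its tick rate shows the
        // best-case granularity available to user-mode code.
        Console.WriteLine($"Stopwatch.Frequency: {Stopwatch.Frequency} ticks/s");
        Console.WriteLine($"Tick length: {1_000_000_000.0 / Stopwatch.Frequency:F1} ns");

        // Busy-waiting can hit sub-millisecond targets on average, but the
        // thread can still be preempted at any moment.
        const double targetMicroseconds = 651; // 768 ppqn at 120 BPM
        long targetTicks = (long)(targetMicroseconds * Stopwatch.Frequency / 1_000_000);

        var sw = Stopwatch.StartNew();
        while (sw.ElapsedTicks < targetTicks) { /* spin */ }
        Console.WriteLine($"Elapsed: {sw.ElapsedTicks * 1_000_000.0 / Stopwatch.Frequency:F1} µs");
    }
}
```

The spin costs a full CPU core, which is why this is a measurement tool here rather than a sequencer design.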

— Ash


Most MIDI sequencers/players either render large blocks of time to a waveform (for playback through the computer's speakers) or queue up a large block of MIDI instructions (for an external device attached to a MIDI port). Either way, a block of data is copied to the sound card, and the sound card takes care of the exact timing, as sketched below.
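A conceptual sketch of that block-building idea (the types and names here are my own; the layout mirrors the MIDIEVENT structure that winmm's midiStream functions consume):

```csharp
using System.Collections.Generic;

// Conceptual sketch: the sequencer emits a whole block of timestamped events,
// and the device/driver plays them back with precise timing.
struct TimestampedMidiEvent
{
    public uint DeltaTicks;   // ticks since the previous event in the block
    public uint Message;      // packed MIDI short message
}

class BlockBuilder
{
    // Pack status/data bytes the way midiOutShortMsg / MIDIEVENT expect:
    // status in the low byte, then data1, then data2.
    public static uint Pack(byte status, byte data1, byte data2)
        => (uint)(status | (data1 << 8) | (data2 << 16));

    public static TimestampedMidiEvent[] BuildBar(int ppqn)
    {
        var block = new List<TimestampedMidiEvent>();
        // Four quarter notes: note-on, then note-off one beat later.
        for (int beat = 0; beat < 4; beat++)
        {
            block.Add(new TimestampedMidiEvent { DeltaTicks = 0,
                Message = Pack(0x90, 60, 100) });      // note-on, middle C, velocity 100
            block.Add(new TimestampedMidiEvent { DeltaTicks = (uint)ppqn,
                Message = Pack(0x80, 60, 0) });        // note-off one beat later
        }
        return block.ToArray();
    }
}
```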

You might want to look at the Multimedia Control APIs.
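For a concrete starting point, here is a minimal P/Invoke sketch of winmm's basic MIDI output calls (my own example; whether this is exactly the API the answer means is an assumption). midiOutShortMsg sends one immediate message; the streaming variants (midiStreamOpen/midiStreamOut) accept whole timestamped blocks, which is the buffering approach described above:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Threading;

class WinmmMidiSketch
{
    [DllImport("winmm.dll")] static extern uint midiOutOpen(out IntPtr handle,
        uint deviceId, IntPtr callback, IntPtr instance, uint flags);
    [DllImport("winmm.dll")] static extern uint midiOutShortMsg(IntPtr handle, uint message);
    [DllImport("winmm.dll")] static extern uint midiOutClose(IntPtr handle);

    static void Main()
    {
        // Open the default MIDI output device (MMSYSERR_NOERROR == 0).
        if (midiOutOpen(out IntPtr h, 0, IntPtr.Zero, IntPtr.Zero, 0) != 0)
            return;                         // no MIDI output device available

        midiOutShortMsg(h, 0x00643C90);     // note-on: channel 0, note 60, velocity 100
        Thread.Sleep(500);
        midiOutShortMsg(h, 0x00003C80);     // note-off
        midiOutClose(h);
    }
}
```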

See this post over at the Microsoft discussion forum

— Ben Voigt


I think you are unlikely to get exactly the right resolution from a timer. A better approach is to use the 1 ms timer and, when it fires, check which MIDI events are pending and fire them.

So, the MIDI events go in a sorted queue, you peek the first one, and set the timer to fire as close as possible to that time. When the timer fires, consume all events from the queue that have elapsed, until you encounter a future event. Calculate time to this event. Reschedule timer.
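A minimal sketch of that queue-and-reschedule loop (class and member names are mine): events live in a time-sorted queue, the loop wakes at roughly 1 ms granularity, and everything past due is flushed in order:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

class MidiScheduler
{
    // Key: due time in Stopwatch ticks; value: the action that plays the note.
    readonly SortedList<long, Action> _queue = new SortedList<long, Action>();
    readonly Stopwatch _clock = Stopwatch.StartNew();
    readonly object _lock = new object();

    public void Schedule(double microsecondsFromNow, Action playNote)
    {
        long due = _clock.ElapsedTicks
                 + (long)(microsecondsFromNow * Stopwatch.Frequency / 1_000_000);
        lock (_lock)
        {
            while (_queue.ContainsKey(due)) due++;  // SortedList forbids duplicate keys
            _queue.Add(due, playNote);
        }
    }

    public void Run(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            lock (_lock)
            {
                // Consume every event whose due time has already elapsed.
                while (_queue.Count > 0 && _queue.Keys[0] <= _clock.ElapsedTicks)
                {
                    _queue.Values[0]();
                    _queue.RemoveAt(0);
                }
            }
            Thread.Sleep(1);  // ~1 ms granularity: the practical floor for a plain timer
        }
    }
}
```

Each individual event lands within about a millisecond of its ideal time, which is usually inaudible, even though the tick spacing itself is shorter than the timer's resolution.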

Of course, if you are outputting to your soundcard, the approach is fundamentally different, and you should be counting samples for all your timings.

— spender