 

Using CoreMIDI input with AVAudioUnit

I'm trying to achieve something that seems like it should be simple: listen for MIDI messages in a Mac app, and use them to play notes on an existing AVAudioUnit instrument.

Hypothesis: I need to write a bridge between the MIDIReadBlock associated with my CoreMIDI client (via MIDIInputPortCreateWithBlock with a MIDIClientRef) and the AUScheduleMIDIEventBlock I can get from my AVAudioUnit's AUAudioUnit (via scheduleMIDIEventBlock). This seems more complex than it should be though, since I'll be mucking around with raw MIDI data – I feel like audio units must support some sort of MIDI abstraction that's easy to use with CoreMIDI, but I can't find any related examples of this. Perhaps there's a way to use MIDIOutputPortCreate with an AV/AUAudioUnit?

What I'm looking for is a working example of piping MIDI input directly into an audio unit (ideally using Swift 3), but if you know of any related resources that are relatively current, please share those links too. The sparsity of documentation for these APIs is pretty frustrating. Thanks!

asked Mar 20 '17 by man1
1 Answer

In your MIDIReadBlock, loop through the packets in the received MIDIPacketList. Based on the MIDI status byte of each packet, send the appropriate device event (e.g. note on) to your audio unit.

e.g.

    osstatus = MusicDeviceMIDIEvent(audioUnit, midiStatus, data1, data2, 0)

(Here audioUnit is the underlying AudioUnit of your AVAudioUnit, and midiStatus, data1, and data2 are the bytes parsed from the packet.)
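Since the question asked for a Swift example, here is one possible sketch of the bridge described above, forwarding the raw packet bytes straight to the audio unit's scheduleMIDIEventBlock instead of parsing them into separate MusicDeviceMIDIEvent calls. The names `client` and `sampler` are assumptions (e.g. an AVAudioUnitSampler already attached to a running AVAudioEngine); this is illustrative, not a drop-in implementation.

```swift
import AVFoundation
import CoreMIDI

// Sketch: bridge a CoreMIDI input port to an AVAudioUnit.
// Assumes `client` is a valid MIDIClientRef and `sampler` is an
// AVAudioUnit (e.g. AVAudioUnitSampler) attached to a running engine.
func connectMIDIInput(client: MIDIClientRef, to sampler: AVAudioUnit) -> MIDIPortRef {
    var inputPort = MIDIPortRef()
    let scheduleMIDI = sampler.auAudioUnit.scheduleMIDIEventBlock
    MIDIInputPortCreateWithBlock(client, "MIDI In" as CFString, &inputPort) { packetList, _ in
        // Note: this block runs on a high-priority CoreMIDI thread.
        var packet = packetList.pointee.packet
        for _ in 0..<packetList.pointee.numPackets {
            // MIDIPacket.data is a fixed-size C tuple; copy out only
            // the `length` valid bytes.
            let bytes = withUnsafeBytes(of: packet.data) {
                Array($0.prefix(Int(packet.length)))
            }
            // Forward the raw MIDI bytes for immediate scheduling.
            bytes.withUnsafeBufferPointer { buf in
                scheduleMIDI?(AUEventSampleTimeImmediate, 0, buf.count, buf.baseAddress!)
            }
            packet = MIDIPacketNext(&packet).pointee
        }
    }
    return inputPort
}
```

After creating the port, connect it to a source with MIDIPortConnectSource. This approach avoids interpreting the MIDI data yourself; the alternative shown above (MusicDeviceMIDIEvent on the AVAudioUnit's audioUnit property) works equally well if you prefer to parse the status bytes.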
answered Oct 31 '22 by Gene De Lisa