
How can I use Apple's Core Audio C API to create a simple, real-time I/O stream on OS X?

After spending quite a while traversing Apple's extensive Core Audio documentation, I'm still unsure which part of the C API I should use to create a basic audio sample I/O stream on OS X.

When I say "I/O stream" I mean a low-latency stream that is spawned for a specific audio device (with parameters such as sample rate, number of channels, bit depth, etc.) and receives/requests buffers of interleaved audio samples to be played back by the device.

I would really appreciate it if someone could point me towards the header and associated functions that I need to achieve this (perhaps even an example) :) Thanks!

PS: Normally I would use PortAudio to achieve this, however in this case I'm interested in accessing the Core Audio framework directly in order to assist a friend in creating a purely Rust portable audio platform. Also, I've posted this question to the Apple developer forums but have not yet received a response so I thought I'd try here. If there is a more suitable exchange/forum to ask at, please let me know.

mindTree asked Mar 18 '23
2 Answers

The easiest way to accomplish this is to instantiate an output audio unit using AudioComponentInstanceNew. Once you create the instance, install a render callback that will provide the audio data in real time. Apple has two technical notes that may help: TN2097 and TN2091. The code involves a bit of boilerplate and ends up fairly long. Here is an example of how to create an output audio unit for the default output device:

AudioComponent comp;
AudioComponentDescription desc;
AudioComponentInstance auHAL;

//There are several different types of Audio Units.
//Some audio units serve as Outputs, Mixers, or DSP
//units. See AUComponent.h for a listing.
desc.componentType = kAudioUnitType_Output;

//Every component has a subType, which gives a clearer picture
//of what this component's function will be.
desc.componentSubType = kAudioUnitSubType_HALOutput;

//All Audio Units in AUComponent.h must use
//kAudioUnitManufacturer_Apple as the manufacturer
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;

//Finds the first component that matches the description
comp = AudioComponentFindNext(NULL, &desc);
if (comp == NULL) exit(-1);

//Gains access to the services provided by the component
AudioComponentInstanceNew(comp, &auHAL);
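To continue from there, a render callback can be attached to the unit's input scope, after which the unit is initialized and started. The sketch below shows one way to do this; the sine-wave generator, the `SineState` struct, and the hard-coded sample rate are illustrative assumptions, not part of Core Audio, and the code only compiles on OS X against the AudioUnit framework:

```c
#include <AudioUnit/AudioUnit.h>
#include <math.h>

// Illustrative state object, passed to the callback via inRefCon.
typedef struct { double phase; double freq; double sampleRate; } SineState;

// AURenderCallback: invoked on the real-time audio thread; it must fill
// ioData with inNumberFrames samples and return quickly (no locks, no I/O).
static OSStatus RenderSine(void *inRefCon,
                           AudioUnitRenderActionFlags *ioActionFlags,
                           const AudioTimeStamp *inTimeStamp,
                           UInt32 inBusNumber,
                           UInt32 inNumberFrames,
                           AudioBufferList *ioData) {
    SineState *s = (SineState *)inRefCon;
    float *out = (float *)ioData->mBuffers[0].mData;
    double step = 2.0 * M_PI * s->freq / s->sampleRate;
    for (UInt32 i = 0; i < inNumberFrames; ++i) {
        out[i] = (float)sin(s->phase);
        s->phase += step;
    }
    return noErr;
}

static SineState gSine = { 0.0, 440.0, 44100.0 };

// Install the callback on the unit's input scope (bus 0), then
// initialize the unit and start rendering.
static void StartOutput(AudioUnit auHAL) {
    AURenderCallbackStruct cb = { RenderSine, &gSine };
    AudioUnitSetProperty(auHAL, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &cb, sizeof(cb));
    AudioUnitInitialize(auHAL);
    AudioOutputUnitStart(auHAL);
}
```

In production code each of these calls returns an OSStatus that should be checked rather than ignored.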

There are third party libraries available to simplify the process. One that seems to be popular is Novocaine, although I haven't used it personally.

sbooth answered Apr 09 '23

Unlike iOS, on a Mac there is no default IO or RemoteIO unit; the input and output Audio Units might not be the same, so they must be determined and configured separately, with two different callbacks and something like a circular buffer in between. You may need to list or search through all the available audio components to find the units capable of input.
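A minimal single-producer/single-consumer ring buffer of the kind that can sit between the input and output callbacks might look like this. It is a sketch, not a production design: real-time code would also need atomic loads/stores on the indices so the two audio threads can touch the buffer without locks.

```c
#define RB_CAPACITY 4096u  /* must be a power of two */

typedef struct {
    float data[RB_CAPACITY];
    unsigned head;  /* write position, advanced by the input callback  */
    unsigned tail;  /* read position, advanced by the output callback */
} RingBuffer;

/* Copies up to n samples in; returns the number actually written. */
static unsigned rb_write(RingBuffer *rb, const float *src, unsigned n) {
    unsigned space = RB_CAPACITY - (rb->head - rb->tail);
    if (n > space) n = space;
    for (unsigned i = 0; i < n; ++i)
        rb->data[(rb->head + i) & (RB_CAPACITY - 1)] = src[i];
    rb->head += n;
    return n;
}

/* Copies up to n samples out; returns the number actually read. */
static unsigned rb_read(RingBuffer *rb, float *dst, unsigned n) {
    unsigned avail = rb->head - rb->tail;
    if (n > avail) n = avail;
    for (unsigned i = 0; i < n; ++i)
        dst[i] = rb->data[(rb->tail + i) & (RB_CAPACITY - 1)];
    rb->tail += n;
    return n;
}
```

The power-of-two capacity lets the index wrap-around be a single bitwise AND, which is cheap enough for a real-time callback.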

The OS X framework headers required may include:

  • AudioToolbox/AudioToolbox.h
  • AudioUnit/AudioUnit.h
  • AudioUnit/AUComponent.h
  • AudioUnit/AudioComponent.h
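Since the HAL unit's buses must be configured separately, capture additionally requires enabling IO on the input side explicitly (output is bus 0, input is bus 1). A hedged sketch of that configuration step, along the lines of what TN2091 describes (macOS-only; `auHAL` is an instance created as in the first answer):

```c
#include <AudioUnit/AudioUnit.h>

// Enable input on bus 1 and disable output on bus 0,
// turning the HAL unit into a capture-only unit.
static void ConfigureForInput(AudioUnit auHAL) {
    UInt32 enable = 1, disable = 0;
    AudioUnitSetProperty(auHAL, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &enable, sizeof(enable));
    AudioUnitSetProperty(auHAL, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Output, 0, &disable, sizeof(disable));
}
```

After this, the capture device itself is selected by setting kAudioOutputUnitProperty_CurrentDevice on the unit's global scope.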
hotpaw2 answered Apr 08 '23