iOS Audio Units: When is usage of AUGraphs necessary?

I'm totally new to iOS programming (I'm more of an Android guy...) and have to build an application dealing with audio DSP. (I know it's not the easiest way to approach iOS dev ;) )

The app needs to be able to accept input from both:

1- The built-in microphone
2- The iPod library

Filters may then be applied to the input sound, and the result is to be output to:

1- The speaker
2- A recorded file

My question is this: is an AUGraph necessary in order to, for example, apply multiple filters to the input, or can these different effects be applied by processing the samples with different render callbacks?

If I go with an AUGraph, do I need one Audio Unit for each input, one Audio Unit for the output, and one Audio Unit for each effect/filter?

And finally, if I don't, can I have just one Audio Unit and reconfigure it in order to select the source/destination?

Many thanks for your answers! I'm getting lost with this stuff...

Asked Jul 18 '11 by Acacio Martins

1 Answer

You may indeed use render callbacks if you wish, but the built-in Audio Units are great (and there are things coming that I can't talk about here yet under NDA, etc.; I've said too much. If you have access to the iOS 5 SDK, I recommend you have a look).

You can implement the behavior you want without using an AUGraph; however, it is recommended that you use one, as it takes care of a lot of things under the hood and saves you time and effort.

Using AUGraph

From the Audio Unit Hosting Guide (iOS Developer Library):

The AUGraph type adds thread safety to the audio unit story: It enables you to reconfigure a processing chain on the fly. For example, you could safely insert an equalizer, or even swap in a different render callback function for a mixer input, while audio is playing. In fact, the AUGraph type provides the only API in iOS for performing this sort of dynamic reconfiguration in an audio app.
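To make that concrete, here is a minimal sketch (not from the documentation) of swapping the render callback that feeds a mixer input while the graph is running. It assumes an AUGraph and mixer node you have already built and started; MyNewRenderCallback is a hypothetical callback, and error handling is omitted.

#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

// Hypothetical render callback that supplies samples for a mixer input bus
// (it just writes silence in this sketch).
static OSStatus MyNewRenderCallback (void                        *inRefCon,
                                     AudioUnitRenderActionFlags  *ioActionFlags,
                                     const AudioTimeStamp        *inTimeStamp,
                                     UInt32                       inBusNumber,
                                     UInt32                       inNumberFrames,
                                     AudioBufferList             *ioData)
{
    for (UInt32 i = 0; i < ioData->mNumberBuffers; ++i)
        memset (ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    return noErr;
}

// Swap the callback on mixer input bus 0 of an already-running graph.
static void SwapMixerInputCallback (AUGraph graph, AUNode mixerNode)
{
    AURenderCallbackStruct callback;
    callback.inputProc       = MyNewRenderCallback;
    callback.inputProcRefCon = NULL;

    // Register the new callback on the mixer node's input bus 0...
    AUGraphSetNodeInputCallback (graph, mixerNode, 0, &callback);

    // ...then let the graph apply the change safely, even during playback.
    Boolean isUpdated = false;
    AUGraphUpdate (graph, &isUpdated);
}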

Choosing A Design Pattern (iOS Developer Library) goes into some detail on how you would choose to implement your Audio Unit environment, from setting up the audio session and the graph to configuring/adding units and writing callbacks.

As for which Audio Units you would want in the graph, in addition to what you already stated, you will want to have a MultiChannel Mixer Unit (see Using Specific Audio Units (iOS Developer Library)) to mix your two audio inputs and then hook up the mixer to the Output unit.
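As a rough sketch of that layout (not from the documentation; the bus numbers and the BuildPlaybackGraph name are my own, and stream-format setup and error handling are omitted), building such a graph could look like this:

#include <AudioToolbox/AudioToolbox.h>

// Build a minimal graph: MultiChannel Mixer -> Remote I/O (speaker).
static AUGraph BuildPlaybackGraph (void)
{
    AUGraph graph;
    NewAUGraph (&graph);

    // Describe the two built-in units we want in the graph.
    AudioComponentDescription mixerDesc = {
        .componentType         = kAudioUnitType_Mixer,
        .componentSubType      = kAudioUnitSubType_MultiChannelMixer,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponentDescription ioDesc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AUNode mixerNode, ioNode;
    AUGraphAddNode (graph, &mixerDesc, &mixerNode);
    AUGraphAddNode (graph, &ioDesc, &ioNode);

    // Opening the graph instantiates the units so they can be configured
    // (bus counts, stream formats, callbacks) before initialization.
    AUGraphOpen (graph);

    // Mixer output bus 0 -> Remote I/O input element 0 (the speaker side).
    AUGraphConnectNodeInput (graph, mixerNode, 0, ioNode, 0);

    AUGraphInitialize (graph);
    AUGraphStart (graph);
    return graph;
}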

Direct Connection

Alternatively, if you want to do it directly without using an AUGraph, the following code shows how to hook Audio Units together yourself (from Constructing Audio Unit Apps, iOS Developer Library):

You can, alternatively, establish and break connections between audio units directly by using the audio unit property mechanism. To do so, use the AudioUnitSetProperty function along with the kAudioUnitProperty_MakeConnection property, as shown in Listing 2-6. This approach requires that you define an AudioUnitConnection structure for each connection to serve as its property value.

/*Listing 2-6*/
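// mixerUnitInstance and ioUnitInstance are assumed to be audio unit
// instances obtained earlier (for example, via AudioComponentInstanceNew).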
AudioUnitElement mixerUnitOutputBus  = 0;
AudioUnitElement ioUnitOutputElement = 0;

AudioUnitConnection mixerOutToIoUnitIn;
mixerOutToIoUnitIn.sourceAudioUnit    = mixerUnitInstance;
mixerOutToIoUnitIn.sourceOutputNumber = mixerUnitOutputBus;
mixerOutToIoUnitIn.destInputNumber    = ioUnitOutputElement;

AudioUnitSetProperty (
    ioUnitInstance,                     // connection destination
    kAudioUnitProperty_MakeConnection,  // property key
    kAudioUnitScope_Input,              // destination scope
    ioUnitOutputElement,                // destination element
    &mixerOutToIoUnitIn,                // connection definition
    sizeof (mixerOutToIoUnitIn)
);
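And if you take the render-callback route the question mentions, a callback can be attached to a unit's input through the same property mechanism. This is only a sketch under the same assumptions as Listing 2-6 (ioUnitInstance already exists, MyRenderCallback is a hypothetical callback, and error handling is omitted):

#include <AudioToolbox/AudioToolbox.h>

// Hypothetical render callback: fill ioData with your samples here
// (left empty in this sketch).
static OSStatus MyRenderCallback (void                        *inRefCon,
                                  AudioUnitRenderActionFlags  *ioActionFlags,
                                  const AudioTimeStamp        *inTimeStamp,
                                  UInt32                       inBusNumber,
                                  UInt32                       inNumberFrames,
                                  AudioBufferList             *ioData)
{
    return noErr;
}

// Attach the callback to the unit's input element 0.
static void AttachRenderCallback (AudioUnit ioUnitInstance)
{
    AURenderCallbackStruct renderCallback;
    renderCallback.inputProc       = MyRenderCallback;
    renderCallback.inputProcRefCon = NULL;

    AudioUnitSetProperty (
        ioUnitInstance,                        // unit that pulls from the callback
        kAudioUnitProperty_SetRenderCallback,  // property key
        kAudioUnitScope_Input,                 // attach on the input scope
        0,                                     // input element (bus) 0
        &renderCallback,
        sizeof (renderCallback)
    );
}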
Answered Oct 10 '22 by DJ Bouche