 

Matching Input & Output Hardware Settings for AVAudioEngine

I am trying to build a very simple audio effects chain using Core Audio on iOS. So far I have implemented an EQ -> Compression -> Limiter chain, which works perfectly in the simulator. However, on device the application crashes when connecting nodes to the AVAudioEngine, due to an apparent mismatch between the input and output hardware formats.

'com.apple.coreaudio.avfaudio', reason: 'required condition is false:
 IsFormatSampleRateAndChannelCountValid(outputHWFormat)'

Taking a basic example, my audio graph is as follows:

Mic -> Limiter -> Main Mixer (and Output)

and the graph is populated using

engine.connect(engine.inputNode!, to: limiter, format: engine.inputNode!.outputFormatForBus(0))
engine.connect(limiter, to: engine.mainMixerNode, format: engine.inputNode!.outputFormatForBus(0))

which crashes with the above exception. If I instead use the limiter's format when connecting to the mixer

engine.connect(engine.inputNode!, to: limiter, format: engine.inputNode!.outputFormatForBus(0))
engine.connect(limiter, to: engine.mainMixerNode, format: limiter.outputFormatForBus(0))

the application crashes with a kAudioUnitErr_FormatNotSupported error

'com.apple.coreaudio.avfaudio', reason: 'error -10868'

Before connecting the audio nodes in the engine, the inputNode has 1 channel and a sample rate of 44,100 Hz, while the outputNode has 0 channels and a sample rate of 0 Hz (deduced by calling outputFormatForBus(0) on each node). Could this be because no node is yet connected to the output mixer? Setting the preferred sample rate on AVAudioSession made no difference.
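
For reference, this is roughly how the formats above were read (a sketch, using the same Swift 2 era API as the rest of this question):

// Query the hardware formats before making any connections.
let inputFormat = engine.inputNode!.outputFormatForBus(0)
let outputFormat = engine.outputNode.outputFormatForBus(0)
print("input: \(inputFormat.channelCount) ch at \(inputFormat.sampleRate) Hz")   // 1 ch at 44100.0 Hz
print("output: \(outputFormat.channelCount) ch at \(outputFormat.sampleRate) Hz") // 0 ch at 0.0 Hz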

Is there something that I am missing here? I have microphone access (verified using AVAudioSession.sharedInstance().recordPermission()), and I have set the AVAudioSession category to record (AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryRecord)).
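
For completeness, the session setup looks roughly like this (a sketch; error handling trimmed):

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryRecord)
    try session.setActive(true)
} catch {
    print("Session setup failed: \(error)")
}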

The limiter is an AVAudioUnitEffect initialized as follows:

let limiter = AVAudioUnitEffect( audioComponentDescription:
                AudioComponentDescription(
                    componentType: kAudioUnitType_Effect,
                    componentSubType: kAudioUnitSubType_PeakLimiter,
                    componentManufacturer: kAudioUnitManufacturer_Apple,
                    componentFlags: 0,
                    componentFlagsMask: 0) )
engine.attachNode(limiter)

and engine is a class-level variable:

var engine = AVAudioEngine()

As I said, this works perfectly in the simulator (with the Mac's default hardware) but consistently crashes on various iPads running iOS 8 and iOS 9. I do have a super basic example working which records the mic input to a file and plays it back through a player into the output mixer:

do {
    // URLToDocumentsFolderForName is a custom NSURL extension that builds
    // a file URL in the app's Documents folder.
    file = try AVAudioFile(forWriting: NSURL.URLToDocumentsFolderForName(name: "test", withType: "caf")!,
                           settings: engine.inputNode!.outputFormatForBus(0).settings)
} catch {
    print("Failed to create AVAudioFile: \(error)")
}
engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)

Here the inputNode has 1 channel at 44,100 Hz, while the outputNode has 2 channels at 44,100 Hz, and no mismatch occurs. Thus the issue must lie in the way the AVAudioUnitEffect is connected to the output mixer.

Any help would be greatly appreciated.

asked Jan 26 '16 by DeFunc Art


1 Answer

This depends on some factors outside of the code you've shared, but it's possible you're using the wrong AVAudioSession category.

I ran into the same issue under slightly different circumstances. With AVAudioSessionCategoryRecord as the AVAudioSession category, attempting to connect an audio tap produced the same crash, and my AVAudioEngine's inputNode reported an outputFormat with a 0.0 sample rate.

After changing the category to AVAudioSessionCategoryPlayAndRecord, I received the expected 44,100 Hz sample rate and the issue was resolved.
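
For illustration, the working setup looked roughly like this (a sketch reusing the engine and limiter from the question; error handling trimmed):

let session = AVAudioSession.sharedInstance()
do {
    // PlayAndRecord exposes both the input and output hardware,
    // so inputNode and outputNode both report valid formats.
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.setActive(true)
} catch {
    print("Session setup failed: \(error)")
}

// Connect the nodes only after the session is configured.
engine.connect(engine.inputNode!, to: limiter, format: engine.inputNode!.outputFormatForBus(0))
engine.connect(limiter, to: engine.mainMixerNode, format: engine.inputNode!.outputFormatForBus(0))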

answered Sep 29 '22 by jeffro37