Here is what I see in the log:
16:33:20.236: Call is Dialing
16:33:21.088: AVAudioSessionInterruptionNotification
16:33:21.450: AVAudioSessionRouteChangeNotification
16:33:21.450: ....change reason CategoryChange
16:33:21.539: AVAudioEngineConfigurationChangeNotification
16:33:21.542: Starting Audio Engine
16:33:23.863: AVAudioSessionRouteChangeNotification
16:33:23.863: ....change reason OldDeviceUnavailable
16:33:23.860 ERROR: [0x100a70000] AVAudioIONodeImpl.mm:317: ___ZN13AVAudioIOUnit11GetHWFormatEjPj_block_invoke: required condition is false: hwFormat
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: hwFormat'
I've subscribed to both AVAudioEngineConfigurationChangeNotification and AVAudioSessionInterruptionNotification:
@objc private func handleAudioEngineConfigurationChangeNotification(notification: NSNotification) {
    println2(notification.name)
    makeEngineConnections()
    startEngine()
}
@objc private func handleAudioSessionInterruptionNotification(notification: NSNotification) {
    println2(notification.name)
    if let interruptionType = AVAudioSessionInterruptionType(rawValue: notification.userInfo?[AVAudioSessionInterruptionTypeKey] as! UInt) {
        switch interruptionType {
        case .Began:
            audioPlayerNode.stop()
        case .Ended:
            if let interruptionOptionValue = notification.userInfo?[AVAudioSessionInterruptionOptionKey] as? UInt {
                let interruptionOption = AVAudioSessionInterruptionOptions(interruptionOptionValue)
                if interruptionOption == AVAudioSessionInterruptionOptions.OptionShouldResume {
                    AVAudioSession.sharedInstance().setActive(true, error: nil)
                    startEngine()
                }
            }
        }
    }
}
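(For reference, the decision logic in the handler above can be factored into a pure helper that runs off-device. This is only a sketch: the string keys and raw values stand in for the real AVAudioSession constants — interruption type raw value 1 = Began, 0 = Ended; option bit 1 = OptionShouldResume — so verify them against the SDK headers.)

```swift
// Sketch: pure-logic version of the interruption handler's decision.
enum InterruptionAction {
    case stopPlayer      // .Began: stop audioPlayerNode
    case restartEngine   // .Ended with ShouldResume: reactivate session, start engine
    case ignore
}

func actionForInterruption(_ userInfo: [String: Any]?) -> InterruptionAction {
    guard let typeValue = userInfo?["AVAudioSessionInterruptionTypeKey"] as? UInt else {
        return .ignore
    }
    if typeValue == 1 {                                    // Began
        return .stopPlayer
    }
    let options = (userInfo?["AVAudioSessionInterruptionOptionKey"] as? UInt) ?? 0
    return (options & 1) != 0 ? .restartEngine : .ignore   // OptionShouldResume
}
```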
func startEngine() {
    println2("Starting Audio Engine")
    var error: NSError?
    if !audioEngine.running {
        audioEngine.startAndReturnError(&error)
        if let error = error {
            println2("Error initializing Audio Engine: " + error.localizedDescription)
        }
    }
}
private func makeEngineConnections() {
    let mainMixer = audioEngine.mainMixerNode
    audioEngine.connect(audioPlayerNode, to: audioEqNode, format: mainMixer.outputFormatForBus(0))
    audioEngine.connect(audioEqNode, to: audioTimePitch, format: mainMixer.outputFormatForBus(0))
    audioEngine.connect(audioTimePitch, to: mainMixer, format: mainMixer.outputFormatForBus(0))
}
but it doesn't seem to work.
How can I avoid this crash?
I suspect the problem is this line: audioEngine.connect(audioTimePitch, to: mainMixer, format: mainMixer.outputFormatForBus(0))
The documentation says:
This method calls connect:to:fromBus:toBus:format: using bus 0 for the source audio node, and bus 0 for the destination audio node, except in the case of a destination which is a mixer, in which case the destination is the mixer’s nextAvailableInputBus.
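(Given that note, one commonly suggested workaround — a sketch, not a verified fix — is to also observe AVAudioSessionRouteChangeNotification, which in the log fires with OldDeviceUnavailable right before the crash, and rebuild the connections only after stopping the engine, so the graph never holds a format from the old route. The handler name below is hypothetical; it assumes the same audioEngine, makeEngineConnections, and startEngine as above.)

```swift
// Sketch: stop the engine before reconnecting, so that
// mainMixer.outputFormatForBus(0) is re-queried for the new route.
@objc private func handleRouteChangeNotification(notification: NSNotification) {
    println2(notification.name)
    audioEngine.stop()
    makeEngineConnections()
    startEngine()
}
```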