I'm making an app that records (AVCaptureSession) and plays (AVPlayerLayer) video. I'd like to be able to do this without pausing background audio from other apps, and I'd like the playback to respect the mute switch.
In the AppDelegate I have set AVAudioSessionCategoryAmbient, which the docs describe as:
The category for an app in which sound playback is nonprimary—that is, your app can be used successfully with the sound turned off.
This category is also appropriate for “play along” style apps, such as a virtual piano that a user plays while the Music app is playing. When you use this category, audio from other apps mixes with your audio. Your audio is silenced by screen locking and by the Silent switch (called the Ring/Silent switch on iPhone).
This perfectly describes the behavior I'm looking for. But it doesn't work.
I know it's set because printing AVAudioSession.sharedInstance().category in any view controller returns AVAudioSessionCategoryAmbient.
Any ideas? I'm using Swift but even a vague direction to look in would be appreciated.
If you have a microphone input, an AVCaptureSession will, by default, set your app's AVAudioSession to AVAudioSessionCategoryPlayAndRecord. You've got to tell it not to:
captureSession.automaticallyConfiguresApplicationAudioSession = false
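For context, a minimal sketch of where that flag fits during session setup (captureSession and micInput are placeholder names, not part of the answer):

let captureSession = AVCaptureSession()
captureSession.automaticallyConfiguresApplicationAudioSession = false

captureSession.beginConfiguration()
// micInput: an AVCaptureDeviceInput made from the default audio device
if captureSession.canAddInput(micInput) {
    captureSession.addInput(micInput)
}
captureSession.commitConfiguration()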
Doing this alone, however, just froze the app, because unfortunately AVAudioSessionCategoryAmbient just doesn't work with AVCaptureSession.
The solution is to set your app's AVAudioSession to AVAudioSessionCategoryPlayAndRecord with options:
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: [.MixWithOthers, .AllowBluetooth, .DefaultToSpeaker])
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error as NSError {
    print(error)
}
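For reference, in current Swift the same call would look roughly like this (a sketch assuming the iOS 10+ setCategory(_:mode:options:) API):

do {
    // Same intent in modern Swift: mix with other apps, allow Bluetooth
    // input, and route output to the bottom speaker.
    try AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                    mode: .default,
                                                    options: [.mixWithOthers, .allowBluetooth, .defaultToSpeaker])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error)
}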
.MixWithOthers was the most important one: it lets the audio from other apps keep playing. But it switched output to the earpiece, which was super odd (I thought the audio was getting ducked at first). So .DefaultToSpeaker moves it to the bottom speaker, and .AllowBluetooth lets you keep Bluetooth audio coming out of headphones while also enabling a Bluetooth mic. Not sure if this can be refined any further, but they seemed like all the relevant options.
During recording, you set your AVAudioSession to AVAudioSessionCategoryPlayAndRecord, but that doesn't respect the mute switch, because you can't set AVAudioSessionCategoryAmbient while there's a microphone input. The trick is to remove the mic from the AVCaptureSession, then set the AVAudioSession to AVAudioSessionCategoryAmbient:
do {
    captureSession.removeInput(micInput)
    try AVAudioSession.sharedInstance().setActive(false)
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error as NSError {
    print(error)
}
Once you have finished playback and need to go back to recording, set AVAudioSessionCategoryPlayAndRecord again (with the same options so the background audio continues):
do {
    try AVAudioSession.sharedInstance().setActive(false)
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: [.MixWithOthers, .AllowBluetooth, .DefaultToSpeaker])
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error as NSError {
    print(error)
}
captureSession.automaticallyConfiguresApplicationAudioSession = false
captureSession.addInput(micInput!)
The first line in the do block, setActive(false), was the thing that had me caught up for a long time. I didn't need to set the audio session to inactive to switch to AVAudioSessionCategoryAmbient, but skipping it paused background audio when coming back to AVAudioSessionCategoryPlayAndRecord.
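Putting the two transitions together, here's a consolidated sketch in current Swift (captureSession and micInput are assumed to be properties of your recorder object):

func enterPlaybackMode() throws {
    // With the mic input removed, the Ambient category is allowed again.
    captureSession.removeInput(micInput)
    try AVAudioSession.sharedInstance().setActive(false)
    try AVAudioSession.sharedInstance().setCategory(.ambient, mode: .default)
    try AVAudioSession.sharedInstance().setActive(true)
}

func enterRecordingMode() throws {
    // Deactivating first is what keeps background audio from pausing.
    try AVAudioSession.sharedInstance().setActive(false)
    try AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                    mode: .default,
                                                    options: [.mixWithOthers, .allowBluetooth, .defaultToSpeaker])
    try AVAudioSession.sharedInstance().setActive(true)
    captureSession.automaticallyConfiguresApplicationAudioSession = false
    captureSession.addInput(micInput)
}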
EDIT: This may only be relevant for those using AVAssetWriter to record video using sample buffers, which is orders of magnitude better to manage when directly manipulating and rendering frame-by-frame outputs from the camera.
I was struggling with this for a while because I was building a complex app that had to alternate between (1) playing video (with audio from other apps) and (2) playing and recording video (with audio from other apps). In my context, if you had an AVPlayer object playing video as well as an AVCaptureSession recording video from an input device, you had to add the following before playing the video:
do {
    // MARK: - AVAudioSession
    try AVAudioSession.sharedInstance().setActive(false, options: [])
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [.mixWithOthers])
    try AVAudioSession.sharedInstance().setActive(true, options: [])
} catch let error {
    print("\(#file)/\(#line) - Error setting the AVAudioSession for mixing audio playback: \(error.localizedDescription).")
}
Next, to record video, with or without audio from other apps and/or an AVPlayer object playing video with audio, you had to do the following:
// Prepare the audio session to allow simultaneous audio playback while recording video
do {
    // MARK: - AVAudioSession
    try AVAudioSession.sharedInstance().setActive(false, options: [.notifyOthersOnDeactivation])
} catch let error {
    print("\(#file)/\(#line) - Error deactivating the AVAudioSession: \(error.localizedDescription).")
}
do {
    // MARK: - AVAudioSession
    try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: [.mixWithOthers, .defaultToSpeaker, .allowBluetoothA2DP])
} catch let error {
    print("\(#file)/\(#line) - Error setting the AVAudioSession category: \(error.localizedDescription).")
}
do {
    // MARK: - AVAudioSession
    try AVAudioSession.sharedInstance().setActive(true, options: [])
} catch let error {
    print("\(#file)/\(#line) - Error activating the AVAudioSession: \(error.localizedDescription).")
}
// Set up the audio device input
setupAudioDeviceInput { (error: NGError?) in
    if error == nil {
        print("\(#file)/\(#line) - Successfully added audio-device input.")
    } else {
        print("\(#file)/\(#line) - Error: \(error?.localizedDescription as Any)")
    }
}
// Set up the audio data output
setupAudioDataOutput { (error: NGError?) in
    if error == nil {
        print("\(#file)/\(#line) - Successfully added audio-data output.")
    } else {
        print("\(#file)/\(#line) - Error: \(error?.localizedDescription as Any)")
    }
}
/// (1) Initializes the AVCaptureDevice for audio, (2) creates the associated AVCaptureDeviceInput for audio, and (3) adds the audio device input to the AVCaptureSession.
/// - Parameter completionHandler: Called with an optional NGError if the setup for the audio device input fails.
func setupAudioDeviceInput(completionHandler: @escaping (_ error: NGError?) -> ()) {
    // Set up the AVCaptureDevice for audio input
    self.audioCaptureDevice = AVCaptureDevice.default(for: .audio)
    // Unwrap the AVCaptureDevice for audio input
    guard let audioCaptureDevice = self.audioCaptureDevice else {
        // MARK: - NGError
        let error = NGError(message: "\(#file)/\(#line) - Couldn't unwrap the AVCaptureDevice for audio.")
        // Pass the values in the completion handler
        completionHandler(error)
        return
    }
    do {
        // Create the AVCaptureDeviceInput for the audio AVCaptureDevice
        self.audioCaptureDeviceInput = try AVCaptureDeviceInput(device: audioCaptureDevice)
        // Add the AVCaptureDeviceInput for the audio
        if self.captureSession.canAddInput(self.audioCaptureDeviceInput) {
            self.captureSession.addInput(self.audioCaptureDeviceInput)
            // Pass the values in the completion handler
            completionHandler(nil)
        } else {
            // MARK: - NGError
            let error = NGError(message: "\(#file)/\(#line) - Couldn't add the AVCaptureDeviceInput to the capture session.")
            // Pass the values in the completion handler
            completionHandler(error)
        }
    } catch let error {
        // MARK: - NGError
        let error = NGError(message: "\(#file)/\(#line) - Error setting up audio input for the capture session: \(error.localizedDescription)")
        // Pass the values in the completion handler
        completionHandler(error)
    }
}
/// (1) Initializes the AVCaptureAudioDataOutput, (2) sets its AVCaptureAudioDataOutputSampleBufferDelegate, and (3) adds it to the AVCaptureSession.
/// - Parameter completionHandler: Called with an optional NGError if the setup fails.
func setupAudioDataOutput(completionHandler: @escaping (_ error: NGError?) -> ()) {
    // Set up the AVCaptureAudioDataOutput
    self.audioDataOutput = AVCaptureAudioDataOutput()
    // Determine whether the AVCaptureSession can add the audio data output
    if self.captureSession.canAddOutput(self.audioDataOutput) {
        // Set the AVCaptureAudioDataOutput's sample buffer delegate and add it to the AVCaptureSession
        self.audioDataOutput.setSampleBufferDelegate(self, queue: self.outputQueue)
        self.captureSession.addOutput(self.audioDataOutput)
        // Pass the values to the completion handler
        completionHandler(nil)
    } else {
        // MARK: - NGError
        let error = NGError(message: "\(#file)/\(#line) - Couldn't add the AVCaptureAudioDataOutput to the AVCaptureSession.")
        // Pass the values to the completion handler
        completionHandler(error)
    }
}
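NGError is this answer's own error type, not an AVFoundation one. A minimal stand-in, just enough for the snippets above to compile, might look like:

// Hypothetical stand-in for the custom NGError type used above.
struct NGError: LocalizedError {
    let message: String
    var errorDescription: String? { message }
}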
Once you've prepared all that, set up and configure your AVAssetWriter to begin writing the video data to a file.
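The answer doesn't show that setup; a minimal sketch might look like this (outputURL and firstSampleTime are placeholder names, and the output settings are only examples):

// Hypothetical minimal AVAssetWriter setup for real-time sample buffers.
let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 720
])
videoInput.expectsMediaDataInRealTime = true

// nil outputSettings writes the appended audio samples without re-encoding
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)
audioInput.expectsMediaDataInRealTime = true

if writer.canAdd(videoInput) { writer.add(videoInput) }
if writer.canAdd(audioInput) { writer.add(audioInput) }

writer.startWriting()
writer.startSession(atSourceTime: firstSampleTime)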
// Remove the audio device input and its audio data output.
if let audioInput = audioCaptureDeviceInput, let audioOutput = audioDataOutput {
    captureSession.removeInput(audioInput)
    captureSession.removeOutput(audioOutput)
} else {
    print("\(#file)/\(#line) - Couldn't remove the AVCaptureDeviceInput for audio and the AVCaptureAudioDataOutput.")
}
Make sure you mark the AVAssetWriter's video and audio inputs as finished after you've finished processing the video/audio data.
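A short sketch of that last step, using the writer and inputs from the sketch above:

// Finish the writer once the last sample buffers have been appended.
videoInput.markAsFinished()
audioInput.markAsFinished()
writer.finishWriting {
    print("Finished writing to \(writer.outputURL)")
}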