How do I make AVCaptureSession and AVPlayer respect AVAudioSessionCategoryAmbient?

I'm making an app that records (AVCaptureSession) and plays (AVPlayerLayer) video. I'd like to be able to do this without pausing background audio from other apps and I'd like the playback to respect the mute switch.

In the AppDelegate I have set AVAudioSessionCategoryAmbient. According to the docs:

The category for an app in which sound playback is nonprimary—that is, your app can be used successfully with the sound turned off.

This category is also appropriate for “play along” style apps, such as a virtual piano that a user plays while the Music app is playing. When you use this category, audio from other apps mixes with your audio. Your audio is silenced by screen locking and by the Silent switch (called the Ring/Silent switch on iPhone).

This perfectly describes the behavior I'm looking for. But it doesn't work.

I know it's set, because print(AVAudioSession.sharedInstance().category) in any view controller returns AVAudioSessionCategoryAmbient.
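For reference, the category is set in the AppDelegate roughly like this (a sketch of the setup described above; the question doesn't include the actual code):

// In application(_:didFinishLaunchingWithOptions:); a sketch, not the asker's exact code
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error as NSError {
    print(error)
}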

Any ideas? I'm using Swift but even a vague direction to look in would be appreciated.

asked Nov 19 '15 by Andy Taylor

2 Answers

How to Mix Background Audio With an AVCapture Session:

If you have a microphone input, an AVCaptureSession will, by default, set your app's AVAudioSession to AVAudioSessionCategoryPlayAndRecord. You have to tell it not to:

captureSession.automaticallyConfiguresApplicationAudioSession = false

Doing this alone, however, just froze the app, because unfortunately AVAudioSessionCategoryAmbient simply doesn't work with AVCaptureSession.

The solution is to set your app's AVAudioSession to AVAudioSessionCategoryPlayAndRecord with options:

do {
    // Mix with background audio, allow Bluetooth routes, and route playback to the speaker
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: [.MixWithOthers, .AllowBluetooth, .DefaultToSpeaker])
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error as NSError {
    print(error)
}

.MixWithOthers is the most important option: it lets audio from other apps keep playing. But it also routed playback to the earpiece, which was odd (at first I thought the audio was being ducked). .DefaultToSpeaker moves playback back to the bottom speaker, and .AllowBluetooth keeps audio going to Bluetooth headphones while also enabling a Bluetooth mic. Not sure if this can be refined further, but those seemed like all the relevant options.

How to Respect the Mute Switch in Playback:

During recording, your AVAudioSession is set to AVAudioSessionCategoryPlayAndRecord, but that category doesn't respect the mute switch, and you can't set AVAudioSessionCategoryAmbient while there's a microphone input. The trick is to remove the mic from the AVCaptureSession, then set the AVAudioSession to AVAudioSessionCategoryAmbient:

do {
    // Remove the mic so the session no longer holds a recording input
    captureSession.removeInput(micInput)
    // Deactivate, switch to Ambient (silenced by the mute switch), then reactivate
    try AVAudioSession.sharedInstance().setActive(false)
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error as NSError {
    print(error)
}

Once you have finished playback and need to go back to recording, set AVAudioSessionCategoryPlayAndRecord again, with the same options so the background audio continues:

do {
    // Deactivate first; otherwise switching back pauses background audio (see below)
    try AVAudioSession.sharedInstance().setActive(false)
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: [.MixWithOthers, .AllowBluetooth, .DefaultToSpeaker])
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error as NSError {
    print(error)
}

// Keep the capture session from reconfiguring the audio session, then re-add the mic
captureSession.automaticallyConfiguresApplicationAudioSession = false
captureSession.addInput(micInput!)

The first line in the do block, setActive(false), was the thing that had me caught up for a long time. I didn't need to deactivate the session to switch to AVAudioSessionCategoryAmbient, but without deactivating first, switching back to AVAudioSessionCategoryPlayAndRecord paused the background audio.

answered by Andy Taylor

EDIT: This may only be relevant if you're using AVAssetWriter to record video from sample buffers, which is orders of magnitude easier to manage when you're directly manipulating and rendering frame-by-frame output from the camera.

I struggled with this for a while because I was building a complex app that had to alternate between (1) playing video (with audio from other apps) and (2) playing and recording video (with audio from other apps). In my case, if an AVPlayer was playing video while an AVCaptureSession was recording video from an input device, you had to add the following before playing the video:

do {
    // MARK: - AVAudioSession
    try AVAudioSession.sharedInstance().setActive(false, options: [])
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [.mixWithOthers])
    try AVAudioSession.sharedInstance().setActive(true, options: [])
} catch let error {
    print("\(#file)/\(#line) - Error setting the AVAudioSession for mixed audio playback: \(error.localizedDescription)")
}

Next, to record video, with or without audio from other apps and/or an AVPlayer playing video with audio, you had to do the following:

Prepare the Audio Session

// Prepare the audio session to allow simultaneous audio playback while recording video
do {
    // MARK: - AVAudioSession
    try AVAudioSession.sharedInstance().setActive(false, options: [.notifyOthersOnDeactivation])
} catch let error {
    print("\(#file)/\(#line) - Error deactivating AVAudioSession: \(error.localizedDescription)")
}
do {
    // MARK: - AVAudioSession
    try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: [.mixWithOthers, .defaultToSpeaker, .allowBluetoothA2DP])
} catch let error {
    print("\(#file)/\(#line) - Error setting the AVAudioSession category: \(error.localizedDescription)")
}
do {
    // MARK: - AVAudioSession
    try AVAudioSession.sharedInstance().setActive(true, options: [])
} catch let error {
    print("\(#file)/\(#line) - Error activating the AVAudioSession: \(error.localizedDescription)")
}

Set Up the Audio Device Input/Output on the AVCaptureSession

// Set up the audio device input
setupAudioDeviceInput { (error: NGError?) in
    if error == nil {
        print("\(#file)/\(#line) - Successfully added audio-device input.")
    } else {
        print("\(#file)/\(#line) - Error: \(error?.localizedDescription as Any)")
    }
}

// Set up the audio data output
setupAudioDataOutput { (error: NGError?) in
    if error == nil {
        print("\(#file)/\(#line) - Successfully added audio-data output.")
    } else {
        print("\(#file)/\(#line) - Error: \(error?.localizedDescription as Any)")
    }
}

When these methods are broken down, they're essentially the following:

/// (1) Initializes the AVCaptureDevice for audio, (2) creates the associated AVCaptureDeviceInput for audio, and (3) adds the audio device input to the AVCaptureSession.
/// - Parameter completionHandler: Called with an optional NGError if the setup for the audio device input fails.
func setupAudioDeviceInput(completionHandler: @escaping (_ error: NGError?) -> ()) {
    // Set up the AVCaptureDevice for audio input
    self.audioCaptureDevice = AVCaptureDevice.default(for: .audio)

    // Unwrap the AVCaptureDevice for audio input
    guard let audioCaptureDevice = self.audioCaptureDevice else {
        // MARK: - NGError
        let error = NGError(message: "\(#file)/\(#line) - Couldn't unwrap the AVCaptureDevice for audio.")
        // Pass the error to the completion handler
        completionHandler(error)
        return
    }

    do {
        // Create the AVCaptureDeviceInput for the audio AVCaptureDevice
        self.audioCaptureDeviceInput = try AVCaptureDeviceInput(device: audioCaptureDevice)

        // Add the AVCaptureDeviceInput for the audio
        if self.captureSession.canAddInput(self.audioCaptureDeviceInput) {
            self.captureSession.addInput(self.audioCaptureDeviceInput)

            // Pass the values in the completion handler
            completionHandler(nil)

        } else {
            // MARK: - NGError
            let error = NGError(message: "\(#file)/\(#line) - Couldn't add the AVCaptureDeviceInput to the capture session.")
            // Pass the error to the completion handler
            completionHandler(error)
        }

    } catch let error {
        // MARK: - NGError
        let error = NGError(message: "\(#file)/\(#line) - Error setting up audio input for the capture session: \(error.localizedDescription)")
        // Pass the error to the completion handler
        completionHandler(error)
    }
}

/// (1) Initializes the AVCaptureAudioDataOutput, (2) sets its AVCaptureAudioDataOutputSampleBufferDelegate, and (3) adds it to the AVCaptureSession.
/// - Parameter completionHandler: Called with an optional NGError if the setup fails.
func setupAudioDataOutput(completionHandler: @escaping (_ error: NGError?) -> ()) {
    // Setup the AVCaptureAudioDataOutput
    self.audioDataOutput = AVCaptureAudioDataOutput()

    // Determine if the AVCaptureSession can add the audio data output
    if self.captureSession.canAddOutput(self.audioDataOutput) {
        // Setup the AVCaptureAudioDataOutput's AVCaptureAudioDataOutputSampleBufferDelegate and add it to the AVCaptureSession
        self.audioDataOutput.setSampleBufferDelegate(self, queue: self.outputQueue)
        self.captureSession.addOutput(self.audioDataOutput)

        // Pass the values to the completion handler
        completionHandler(nil)
    } else {
        // MARK: - NGError
        let error = NGError(message: "\(#file)/\(#line) - Couldn't add the AVCaptureAudioDataOutput to the AVCaptureSession.")
        // Pass the values to the completion handler
        completionHandler(error)
    }
}

Set Up the AVAssetWriter

Once you prepare all that, you want to set up and configure your AVAssetWriter to begin writing the video data to a file.
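The answer doesn't show the writer setup itself; a minimal sketch might look like the following. The output URL parameter, the 1080x1920 resolution, and the settings dictionaries are illustrative assumptions, not part of the original answer:

// A minimal AVAssetWriter setup for sample-buffer recording (a sketch).
// The outputURL, resolution, and audio settings are assumed values.
func makeAssetWriter(outputURL: URL) throws -> (writer: AVAssetWriter, videoInput: AVAssetWriterInput, audioInput: AVAssetWriterInput) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    // Video input: H.264 at an assumed 1080x1920 portrait resolution
    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1080,
        AVVideoHeightKey: 1920
    ]
    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    videoInput.expectsMediaDataInRealTime = true // appending live camera buffers

    // Audio input: AAC, stereo, 44.1 kHz (assumed settings)
    let audioSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 2,
        AVSampleRateKey: 44_100
    ]
    let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
    audioInput.expectsMediaDataInRealTime = true

    if writer.canAdd(videoInput) { writer.add(videoInput) }
    if writer.canAdd(audioInput) { writer.add(audioInput) }
    return (writer, videoInput, audioInput)
}

You'd then call writer.startWriting() and writer.startSession(atSourceTime:) with the first buffer's presentation timestamp, and append sample buffers from the capture delegate callbacks whenever the corresponding input's isReadyForMoreMediaData is true.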

Remove Audio Input and Output

// Remove the audio device input and its audio data output.
if let audioInput = audioCaptureDeviceInput, let audioOutput = audioDataOutput {
    captureSession.removeInput(audioInput)
    captureSession.removeOutput(audioOutput)
} else {
    print("\(#file)/\(#line) - Couldn't remove the AVCaptureDeviceInput for audio and the AVCaptureAudioDataOutput.")
}

Make sure you mark the AVAssetWriter's video and audio input as finished after you've finished processing the video/audio data.
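A rough sketch of that teardown, assuming writer, videoInput, and audioInput are the objects from the setup sketched above:

// Finish writing once all buffers have been appended (a sketch; names are
// the assumed writer, videoInput, and audioInput from the setup above).
videoInput.markAsFinished()
audioInput.markAsFinished()
writer.finishWriting {
    if writer.status == .completed {
        print("Finished writing to \(writer.outputURL)")
    } else {
        print("\(#file)/\(#line) - Writer failed: \(String(describing: writer.error))")
    }
}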

answered by saikik