 

Can't play audio recorded from voice using AVCaptureAudioDataOutputSampleBufferDelegate

I have been googling and researching for days, but I can't get this to work and I can't find a solution anywhere online.

I am trying to capture my voice with the microphone and then play it back through the speakers.

Here is my code:

class ViewController: UIViewController, AVAudioRecorderDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {

var recordingSession: AVAudioSession!
var audioRecorder: AVAudioRecorder!
var captureSession: AVCaptureSession!
var microphone: AVCaptureDevice!
var inputDevice: AVCaptureDeviceInput!
var outputDevice: AVCaptureAudioDataOutput!

override func viewDidLoad() {
    super.viewDidLoad()

    recordingSession = AVAudioSession.sharedInstance()

    do {
        try recordingSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try recordingSession.setMode(AVAudioSessionModeVoiceChat)
        try recordingSession.setPreferredSampleRate(44000.00)
        try recordingSession.setPreferredIOBufferDuration(0.2)
        try recordingSession.setActive(true)

        recordingSession.requestRecordPermission() { [unowned self] (allowed: Bool) -> Void in
            DispatchQueue.main.async {
                if allowed {

                    do {
                        self.microphone = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
                        try self.inputDevice = AVCaptureDeviceInput.init(device: self.microphone)

                        self.outputDevice = AVCaptureAudioDataOutput()
                        self.outputDevice.setSampleBufferDelegate(self, queue: DispatchQueue.main)

                        self.captureSession = AVCaptureSession()
                        self.captureSession.addInput(self.inputDevice)
                        self.captureSession.addOutput(self.outputDevice)
                        self.captureSession.startRunning()
                    }
                    catch let error {
                        print(error.localizedDescription)
                    }
                }
            }
        }
    } catch let error {
        print(error.localizedDescription)
    }
}

And the callback function:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    var audioBufferList = AudioBufferList(
        mNumberBuffers: 1,
        mBuffers: AudioBuffer(mNumberChannels: 0,
        mDataByteSize: 0,
        mData: nil)
    )

    var blockBuffer: CMBlockBuffer?

    var osStatus = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(

        sampleBuffer,
        nil,
        &audioBufferList,
        MemoryLayout<AudioBufferList>.size,
        nil,
        nil,
        UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        &blockBuffer
    )

    do {
        var data: NSMutableData = NSMutableData.init()
        for i in 0..<audioBufferList.mNumberBuffers {

            var audioBuffer = AudioBuffer(
                 mNumberChannels: audioBufferList.mBuffers.mNumberChannels,
                 mDataByteSize: audioBufferList.mBuffers.mDataByteSize,
                 mData: audioBufferList.mBuffers.mData
            )

            let frame = audioBuffer.mData?.load(as: Float32.self)
            data.append(audioBuffer.mData!, length: Int(audioBuffer.mDataByteSize))

        }

        var dataFromNsData = Data.init(referencing: data)
        var avAudioPlayer: AVAudioPlayer = try AVAudioPlayer.init(data: dataFromNsData)
        avAudioPlayer.prepareToPlay()
        avAudioPlayer.play()
    }
    catch let error {
        print(error.localizedDescription)
        //prints out The operation couldn’t be completed. (OSStatus error 1954115647.)
    }
}

Any help with this would be amazing, and it would probably help a lot of other people as well, since there are lots of incomplete Swift versions of this out there.

Thank you.

asked Sep 08 '16 by nullforlife

1 Answer

You were very close! You were capturing audio in the didOutputSampleBuffer callback, but that's a high-frequency callback, so you were creating a lot of AVAudioPlayers and passing them raw LPCM data, while AVAudioPlayer only knows how to parse Core Audio file types, and the players were going out of scope anyway. In fact, the OSStatus 1954115647 in your error is the four-character code 'typ?', i.e. kAudioFileUnsupportedFileTypeError: AVAudioPlayer didn't recognize your raw buffer as any audio file format.
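If you want to decode cryptic OSStatus values like that yourself: most Core Audio errors are four printable ASCII characters packed into a 32-bit integer, so a small helper can translate them. A rough sketch (the helper name is mine, not an Apple API):

import Foundation

func fourCharCode(from status: OSStatus) -> String {
    // Reinterpret the signed status as unsigned and split it into 4 bytes
    let n = UInt32(bitPattern: status)
    let bytes: [UInt8] = [
        UInt8((n >> 24) & 0xFF),
        UInt8((n >> 16) & 0xFF),
        UInt8((n >> 8) & 0xFF),
        UInt8(n & 0xFF)
    ]
    // Four printable ASCII bytes mean it's a four-char code; otherwise keep the number
    if !bytes.contains(where: { $0 < 0x20 || $0 > 0x7E }) {
        return String(bytes: bytes, encoding: .ascii) ?? "\(status)"
    }
    return "\(status)"
}

print(fourCharCode(from: 1954115647)) // "typ?" = kAudioFileUnsupportedFileTypeError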

You can very easily play the buffers you're capturing with AVCaptureSession using AVAudioEngine's AVAudioPlayerNode, but at that point you may as well use AVAudioEngine to record from the microphone too:

import UIKit
import AVFoundation

class ViewController: UIViewController {
    var engine = AVAudioEngine()

    override func viewDidLoad() {
        super.viewDidLoad()

        let input = engine.inputNode!
        let player = AVAudioPlayerNode()
        engine.attach(player)

        let bus = 0
        let inputFormat = input.inputFormat(forBus: bus)
        // Connect the player to the mixer using the microphone's own format
        engine.connect(player, to: engine.mainMixerNode, format: inputFormat)

        // Tap the microphone and schedule every captured buffer straight back into the player
        input.installTap(onBus: bus, bufferSize: 512, format: inputFormat) { (buffer, time) -> Void in
            player.scheduleBuffer(buffer)
        }

        try! engine.start()
        player.play()
    }
}
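Two practical notes: test with headphones, since the speaker output will otherwise feed straight back into the microphone, and on a real device you still need an active audio session and record permission before starting the engine. A rough sketch of that setup, using the same Swift 3-era session API as the question (the helper name is mine):

func prepareAudioSession(completion: @escaping (Bool) -> Void) {
    let session = AVAudioSession.sharedInstance()
    do {
        // PlayAndRecord lets AVAudioEngine capture and render simultaneously
        try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
        // VoiceChat mode turns on echo cancellation, which helps against feedback
        try session.setMode(AVAudioSessionModeVoiceChat)
        try session.setActive(true)
    } catch {
        completion(false)
        return
    }
    // Ask for microphone permission before installing the tap
    session.requestRecordPermission { allowed in
        DispatchQueue.main.async { completion(allowed) }
    }
}

Call this before engine.start() and only install the tap once the completion reports true.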
answered Oct 03 '22 by Rhythmic Fistman