Play audio from AVAudioPCMBuffer with AVAudioEngine

I have two classes, MicrophoneHandler and AudioPlayer. I have managed to use AVCaptureSession to tap microphone data using the accepted answer here, and converted the CMSampleBuffer to NSData using this function:

func sendDataToDelegate(buffer: CMSampleBuffer!)
{
    // Get the contiguous block of audio bytes backing the sample buffer
    let block = CMSampleBufferGetDataBuffer(buffer)
    var length = 0
    var data: UnsafeMutablePointer<Int8> = nil

    let status = CMBlockBufferGetDataPointer(block!, 0, nil, &length, &data)
    assert(status == kCMBlockBufferNoErr, "could not get data pointer")

    // Wrap the bytes without copying; freeWhenDone is false because the
    // block buffer still owns this memory
    let result = NSData(bytesNoCopy: data, length: length, freeWhenDone: false)

    self.delegate.handleBuffer(result)
}

I would now like to play the audio over the speaker by converting the NSData produced above to AVAudioPCMBuffer and playing it with AVAudioEngine. My AudioPlayer class is as follows:

var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!

override init()
{
    super.init()

    self.setup()
    self.start()
}

func handleBuffer(data: NSData)
{
    let newBuffer = self.toPCMBuffer(data)
    print(newBuffer)

    self.playerNode.scheduleBuffer(newBuffer, completionHandler: nil)
}

func setup()
{
    self.engine = AVAudioEngine()
    self.playerNode = AVAudioPlayerNode()

    self.engine.attachNode(self.playerNode)
    self.mixer = engine.mainMixerNode

    engine.connect(self.playerNode, to: self.mixer, format: self.mixer.outputFormatForBus(0))
}

func start()
{
    do {
        try self.engine.start()
    }
    catch {
        print("couldn't start engine: \(error)")
    }

    self.playerNode.play()
}

func toPCMBuffer(data: NSData) -> AVAudioPCMBuffer
{
    let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: 8000, channels: 2, interleaved: false)  // given NSData audio format
    let PCMBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: UInt32(data.length) / audioFormat.streamDescription.memory.mBytesPerFrame)

    PCMBuffer.frameLength = PCMBuffer.frameCapacity

    // floatChannelData exposes one buffer per channel (non-interleaved)
    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: Int(PCMBuffer.format.channelCount))

    // Copy every byte of the NSData into the first channel's buffer
    data.getBytes(UnsafeMutablePointer<Void>(channels[0]), length: data.length)

    return PCMBuffer
}

The buffer reaches the handleBuffer: method when self.delegate.handleBuffer(result) is called in the first snippet above.

I am able to print(newBuffer) and see the memory locations of the converted buffers, but nothing comes out of the speakers. I can only imagine something is not consistent between the conversions to and from NSData. Any ideas? Thanks in advance.
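
One way to test that suspicion is to log the true capture format inside sendDataToDelegate and compare it against the 8000 Hz / 2-channel / Float32 format hard-coded in toPCMBuffer. A minimal sketch using CoreMedia's format-description APIs:

if let desc = CMSampleBufferGetFormatDescription(buffer) {
    // The ASBD carries the real sample rate, channel count, and format flags
    let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc)
    if asbd != nil {
        print("capture format: \(asbd.memory)")
    }
}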

asked Nov 25 '15 by Connor Hicks


1 Answer

Skip the raw NSData format

Why not use AVAudioPlayer all the way? If you positively need NSData, you can always load it from the soundURL below. In this example, the recording on disk lives at something like:

let soundURL = documentDirectory.URLByAppendingPathComponent("sound.m4a")

It makes sense to record directly to a file anyway for optimal memory and resource management. You get NSData from your recording this way:

let data = NSFileManager.defaultManager().contentsAtPath(soundURL.path!)

The code below is all you need:

Record

if !audioRecorder.recording {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        // Activate the shared session, then start writing to the file
        try audioSession.setActive(true)
        audioRecorder.record()
    } catch {
        print("could not start recording: \(error)")
    }
}
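
Stop

The snippets here never show ending the recording, which playback requires; a minimal sketch:

if audioRecorder.recording {
    audioRecorder.stop()   // finalizes the .m4a file on disk
    do {
        try AVAudioSession.sharedInstance().setActive(false)
    } catch {}
}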

Play

if !audioRecorder.recording {
    do {
        // Hand the finished file straight to AVAudioPlayer
        try audioPlayer = AVAudioPlayer(contentsOfURL: audioRecorder.url)
        audioPlayer.play()
    } catch {
        print("could not start playback: \(error)")
    }
}

Setup

let audioSession = AVAudioSession.sharedInstance()
do {
    // PlayAndRecord lets the same session capture and play audio
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try audioRecorder = AVAudioRecorder(URL: self.directoryURL()!,
        settings: recordSettings)
    audioRecorder.prepareToRecord()
} catch {
    print("could not set up the recorder: \(error)")
}
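
The setup above calls a directoryURL() helper that isn't shown. A minimal sketch, assuming it simply points at the sound.m4a file in the Documents directory used earlier:

func directoryURL() -> NSURL? {
    // Documents is writable and survives app restarts
    let urls = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    return urls.first?.URLByAppendingPathComponent("sound.m4a")
}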

Settings

let recordSettings = [
    AVSampleRateKey : NSNumber(float: Float(44100.0)),          // 44.1 kHz
    AVFormatIDKey : NSNumber(int: Int32(kAudioFormatMPEG4AAC)), // AAC, matching the .m4a extension
    AVNumberOfChannelsKey : NSNumber(int: 1),                   // mono
    AVEncoderAudioQualityKey : NSNumber(int: Int32(AVAudioQuality.Medium.rawValue))]
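
These snippets also assume two stored properties that aren't declared anywhere above; a sketch, together with the microphone-permission request that recording needs:

var audioRecorder: AVAudioRecorder!
var audioPlayer: AVAudioPlayer!

// Without microphone permission the recorder produces silence
AVAudioSession.sharedInstance().requestRecordPermission { granted in
    if !granted {
        print("microphone permission denied")
    }
}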

Download Xcode Project:

You can find this very example here. Download the full project, which records and plays on both simulator and device, from Swift Recipes.

answered by SwiftArchitect