Trying to stream audio from microphone to another phone via Multipeer Connectivity

I am trying to stream audio from the microphone to another iPhone via Apple's Multipeer Connectivity framework. To do the audio capturing and playback I am using AVAudioEngine (many thanks to Rhythmic Fistman's answer here).

I receive data from the microphone by installing a tap on the input node. This gives me an AVAudioPCMBuffer, which I then convert to a [UInt8] array and stream to the other phone.

But when I convert the array back to an AVAudioPCMBuffer, I get an EXC_BAD_ACCESS crash, with the debugger pointing at the method where I convert the byte array back to an AVAudioPCMBuffer.

Here is the code where I tap, convert, and stream the input:

input.installTap(onBus: 0, bufferSize: 2048, format: input.inputFormat(forBus: 0), block: {
    (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in

    let audioBuffer = self.typetobinary(buffer)
    stream.write(audioBuffer, maxLength: audioBuffer.count)
})

Both of my functions for converting the data (taken from Martin.R's answer here):

func binarytotype <T> (_ value: [UInt8], _: T.Type) -> T {
    return value.withUnsafeBufferPointer {
        UnsafeRawPointer($0.baseAddress!).load(as: T.self)
    }
}

func typetobinary<T>(_ value: T) -> [UInt8] {
    var data = [UInt8](repeating: 0, count: MemoryLayout<T>.size)
    data.withUnsafeMutableBufferPointer {
        UnsafeMutableRawPointer($0.baseAddress!).storeBytes(of: value, as: T.self)
    }
    return data
}

And on the receiving end:

func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {
    if streamName == "voice" {

        stream.schedule(in: RunLoop.current, forMode: .defaultRunLoopMode)
        stream.open()

        var bytes = [UInt8](repeating: 0, count: 8)
        stream.read(&bytes, maxLength: bytes.count)

        let audioBuffer = self.binarytotype(bytes, AVAudioPCMBuffer.self) //Here is where the app crashes

        do {
            try engine.start()

            audioPlayer.scheduleBuffer(audioBuffer, completionHandler: nil)
            audioPlayer.play()
        } catch let error {
            print(error.localizedDescription)
        }
    }
}

The thing is that I can convert the byte array back and forth and play sound from it before streaming (on the same phone), but I cannot create the AVAudioPCMBuffer on the receiving end. Does anyone know why the conversion fails there? Is this the right way to go?

Any help or thoughts on this would be much appreciated.

asked Sep 15 '16 by nullforlife
1 Answer

Your AVAudioPCMBuffer serialisation/deserialisation is wrong.

Swift 3's casting has changed a lot and seems to require more copying than Swift 2.
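To see why the byte copy can't work: AVAudioPCMBuffer is a class, i.e. a reference type, so `MemoryLayout<T>.size` is the size of the object *reference*, not the audio payload. Serialising that only ships a process-local pointer, which is garbage on the other phone. A minimal sketch with an illustrative stand-in class (`FakeBuffer` is not a real API, just a demonstration):

```swift
// FakeBuffer stands in for any class holding a large payload,
// the way AVAudioPCMBuffer holds its sample data.
final class FakeBuffer {
    var samples = [Float](repeating: 0.5, count: 2048)  // ~8 KB of audio
}

// The object owns kilobytes of samples, but a byte-for-byte copy of the
// *variable* is just the reference (8 bytes on a 64-bit platform):
print(MemoryLayout<FakeBuffer>.size)    // size of the reference, not the data
print(MemoryLayout<Float>.size * 2048)  // 8192: the actual sample payload
```

That is why the round trip "works" on the sending phone (the pointer is still valid in that process) but crashes on the receiver: you need to copy the sample data itself, as below.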

Here's how you can convert between [UInt8] and AVAudioPCMBuffers:

N.B: this code assumes mono float data at 44.1kHz.
You might want to change that.

func copyAudioBufferBytes(_ audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)

    // initialize bytes to 0 (how to avoid?)
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)

    // copy data from buffer
    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }

    return audioByteArray
}
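Under the stated assumption (Float32, mono, interleaved), the byte arithmetic both helpers rely on looks like this (variable names are illustrative):

```swift
// Assumed format: .pcmFormatFloat32, 1 channel, interleaved.
let bytesPerSample = MemoryLayout<Float>.size       // 4
let channelCount = 1
let bytesPerFrame = bytesPerSample * channelCount   // 4

// A 2048-frame tap buffer therefore serialises to 8192 bytes...
let frameLength = 2048
let byteCount = frameLength * bytesPerFrame         // 8192

// ...and the receiver recovers the frame count by dividing back:
let recoveredFrames = byteCount / bytesPerFrame     // 2048
```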

func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    // format assumption! make this part of your protocol?
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength

    let dstLeft = audioBuffer.floatChannelData![0]
    // for stereo
    // let dstRight = audioBuffer.floatChannelData![1]

    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }

    return audioBuffer
}
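One way to act on the "make this part of your protocol?" comment is to prefix each packet with its payload size as a little-endian UInt32 header, so the receiver knows exactly how many bytes to read instead of guessing a fixed count. These helpers are a sketch of that idea, not part of the original answer:

```swift
// Prefix the payload with its byte count as a 4-byte little-endian header.
func packPacket(_ payload: [UInt8]) -> [UInt8] {
    var count = UInt32(payload.count).littleEndian
    var header = [UInt8](repeating: 0, count: 4)
    withUnsafeBytes(of: &count) { header.replaceSubrange(0..<4, with: $0) }
    return header + payload
}

// Read the header back and return the payload, or nil if sizes don't match.
func unpackPacket(_ packet: [UInt8]) -> [UInt8]? {
    guard packet.count >= 4 else { return nil }
    let count = packet.prefix(4).enumerated().reduce(UInt32(0)) {
        $0 | (UInt32($1.element) << (8 * UInt32($1.offset)))
    }
    let payload = Array(packet.dropFirst(4))
    return payload.count == Int(count) ? payload : nil
}
```

With this, the receiving side can read the 4-byte header first, then read exactly that many payload bytes before calling bytesToAudioBuffer, instead of the fixed 8-byte read in the question.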
answered Sep 28 '22 by Rhythmic Fistman