Thank you to everyone who takes the time to read the question!
So I've made a stream using MultipeerConnectivity. I am able to record audio into a CMSampleBuffer and convert that buffer into UInt8 data, which I then send to a peer using:
outputStream!.write(u8ptr, maxLength: Int(buffer.mDataByteSize))
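For context, the write is wrapped in something like this (a sketch; send(audioBytes:) is a hypothetical helper, the CMSampleBuffer-to-bytes extraction is elided, and outputStream comes from MCSession.startStream(withName:toPeer:)):

// Hypothetical helper that writes raw audio bytes to the
// already-opened MultipeerConnectivity output stream.
func send(audioBytes: [UInt8]) {
    audioBytes.withUnsafeBufferPointer { ptr in
        guard let base = ptr.baseAddress else { return }
        let written = outputStream!.write(base, maxLength: audioBytes.count)
        if written < 0 {
            print("\(#file) > \(#function) > write error: \(String(describing: outputStream!.streamError))")
        }
    }
}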
Then when the data shows up on the inputStream the following method is called:
func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
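In full, the delegate just switches on the event; mine looks roughly like this (a sketch):

func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
    switch eventCode {
    case .hasBytesAvailable:
        print("\(#file) > \(#function) > hasBytesAvailable")
        readFromStream()
    case .endEncountered:
        print("\(#file) > \(#function) > endEncountered")
        aStream.close()
    case .errorOccurred:
        print("\(#file) > \(#function) > errorOccurred")
    default:
        break
    }
}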
I have print statements in there, so I know this part is running fine. When data actually shows up, I call my function:
func readFromStream()
I know that I need to call inputStream.read to actually read from the stream, but I'm not sure how to read the data and then convert it to NSData so that it can be played using AVAudioPlayer.
(Unless you're aware of a better, more efficient way)
This is what I have so far, but I haven't tested it, and I assume there are going to be problems.
func readFromStream() {
    var buffer = [UInt8](repeating: 0, count: 1024)
    while inputStream!.hasBytesAvailable {
        let length = inputStream!.read(&buffer, maxLength: buffer.count)
        if length > 0 {
            if audioEngine!.isRunning {
                audioEngine!.stop()
                audioEngine!.reset()
            }
            print("\(#file) > \(#function) > \(length) bytes read")
            let audioBuffer = bytesToAudioBuffer(buffer)
            let mainMixer = audioEngine!.mainMixerNode
            audioEngine!.connect(audioPlayer!, to: mainMixer, format: audioBuffer.format)
            audioPlayer!.scheduleBuffer(audioBuffer, completionHandler: nil)
            do {
                try audioEngine!.start()
            }
            catch let error as NSError {
                print("\(#file) > \(#function) > error: \(error.localizedDescription)")
            }
            audioPlayer!.play()
        }
    }
}
With the code above, no audio plays. It is silent, but I can see that audio data is being received by one of the devices.
So basically, my question is, how do I convert this buffer to the correct data type so that it can be played live?
Thank you for your help! If you need more information, please let me know.
Rather than using CMSampleBuffers, I used AVAudioPCMBuffers. These can be created by recording from an AVAudioEngine. Basically, here is how I converted an AVAudioPCMBuffer to NSData and back.
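For reference, the buffers come from a tap on the engine's input node, something like this (a sketch; sendAudioData is a hypothetical helper that writes the bytes to the output stream):

let input = audioEngine!.inputNode
let recordingFormat = input.outputFormat(forBus: 0)
input.installTap(onBus: 0, bufferSize: 4096, format: recordingFormat) { (buffer, time) in
    // Convert each captured AVAudioPCMBuffer to NSData (function below) and ship it.
    let data = self.audioBufferToNSData(PCMBuffer: buffer)
    self.sendAudioData(data)  // hypothetical: writes data's bytes to outputStream
}
do {
    try audioEngine!.start()
}
catch let error as NSError {
    print("\(#file) > \(#function) > error: \(error.localizedDescription)")
}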
func audioBufferToNSData(PCMBuffer: AVAudioPCMBuffer) -> NSData {
    let channelCount = 1  // given PCMBuffer channel count is 1
    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: channelCount)
    let data = NSData(bytes: channels[0],
                      length: Int(PCMBuffer.frameLength * PCMBuffer.format.streamDescription.pointee.mBytesPerFrame))
    return data
}
func dataToPCMBuffer(format: AVAudioFormat, data: NSData) -> AVAudioPCMBuffer {
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: format,
                                       frameCapacity: UInt32(data.length) / format.streamDescription.pointee.mBytesPerFrame)
    audioBuffer.frameLength = audioBuffer.frameCapacity
    let channels = UnsafeBufferPointer(start: audioBuffer.floatChannelData, count: Int(audioBuffer.format.channelCount))
    data.getBytes(UnsafeMutableRawPointer(channels[0]), length: data.length)
    return audioBuffer
}
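Putting it together on the receiving side, the read loop ends up looking something like this (a sketch; streamFormat must match the format the sender recorded with, and audioPlayer is an AVAudioPlayerNode already attached and connected to the running audioEngine):

func readFromStream() {
    var bytes = [UInt8](repeating: 0, count: 4096)
    while inputStream!.hasBytesAvailable {
        let length = inputStream!.read(&bytes, maxLength: bytes.count)
        if length > 0 {
            // Wrap only the bytes actually read, then rebuild the PCM buffer.
            let data = NSData(bytes: bytes, length: length)
            let pcmBuffer = dataToPCMBuffer(format: streamFormat, data: data)
            audioPlayer!.scheduleBuffer(pcmBuffer, completionHandler: nil)
            if !audioPlayer!.isPlaying {
                audioPlayer!.play()
            }
        }
    }
}

Note that this avoids the stop/reset from my original attempt: the engine stays running and buffers are just scheduled as they arrive.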
To convert your UInt8 array into NSData, see: NSData from UInt8.
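In short, something like this (where bytes is the [UInt8] you read and length is the byte count returned by read):

let data = NSData(bytes: bytes, length: length)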
Once you've done that, just use AVAudioPlayer (its data initializer throws, so it needs try):
if let player = try? AVAudioPlayer(data: data) {
    player.play()
}