I am streaming audio as an ArrayBuffer and need to convert it to an AudioBuffer so that I can listen to it. I receive the stream via a websocket event:
retrieveAudioStream() {
  this.socket.on('stream', (buffer) => {
    // Each 'stream' event delivers a chunk of audio data
    console.log('buffer', buffer);
  });
}
The buffer is an ArrayBuffer, and I need it to be an AudioBuffer so I can listen to it in my application. How can I do this?
You can use the BaseAudioContext.createBuffer() method. It creates a new, empty AudioBuffer object, which can then be populated with data and played via an AudioBufferSourceNode.

See MDN for more info: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBuffer
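A minimal sketch of that approach, assuming the incoming ArrayBuffer holds raw 32-bit float PCM samples, mono, at 44.1 kHz (adjust the channel count and sample rate to whatever your server actually sends; playChunk is a hypothetical name):

const audioCtx = new AudioContext();

function playChunk(arrayBuffer) {
  // Assumption: the buffer contains raw Float32 PCM, mono, 44.1 kHz
  const samples = new Float32Array(arrayBuffer);
  // Create an empty AudioBuffer (channels, length in frames, sample rate)
  const audioBuffer = audioCtx.createBuffer(1, samples.length, 44100);
  // Populate channel 0 with the sample data
  audioBuffer.copyToChannel(samples, 0);
  // Play it via an AudioBufferSourceNode
  const source = audioCtx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(audioCtx.destination);
  source.start();
}

Note this only works if the stream really is uncompressed PCM; a compressed format (e.g. MP3 or Ogg) would need to be decoded first.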
Since you're streaming media rather than downloading a file and then decoding the audio data, AudioContext.createMediaStreamSource() will be much better suited to your use case.

Read more here: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamSource
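Note that createMediaStreamSource() consumes a MediaStream object rather than an ArrayBuffer, so this route applies if you can expose the audio as a MediaStream (for example over WebRTC) instead of raw websocket chunks. A minimal sketch:

const audioCtx = new AudioContext();

function playStream(mediaStream) {
  // Wrap the MediaStream in a source node and route it to the speakers
  const source = audioCtx.createMediaStreamSource(mediaStream);
  source.connect(audioCtx.destination);
}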