I'm building a simple app in which I'm trying to get the buffer, but it seems that onaudioprocess in the following code isn't firing:
<script>
  var audio_context;
  var recorder;

  window.onload = function init() {
    try {
      window.AudioContext = window.AudioContext || window.webkitAudioContext;
      navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia;
      window.URL = window.URL || window.webkitURL;
      audio_context = new AudioContext();
    } catch (e) {
      console.log(e);
    }

    navigator.getUserMedia({audio: true}, startUserMedia);

    function startUserMedia(stream) {
      console.log('Initializing');
      var input = audio_context.createMediaStreamSource(stream);
      input.connect(audio_context.destination);

      var node = input.context.createGain(4096, 2, 2);
      node.onaudioprocess = function (e) {
        console.log('done');
      };
      node.connect(audio_context.destination);
    }
  };
</script>
If the code worked as it should, I would get Initializing followed by done. The problem is that I'm getting only Initializing, and onaudioprocess isn't fired. I'm using the latest Chrome.
onaudioprocess is not a property of GainNode, but of ScriptProcessorNode. See the API reference here.
I'm not really experienced with the Web Audio API, but if I understood correctly you need to insert a ScriptProcessorNode between your gain node and your destination to be able to process those events:
var node = input.context.createGain(); // createGain() takes no arguments; the buffer size belongs to the processor
var processor = input.context.createScriptProcessor(4096, 1, 1);
processor.onaudioprocess = function (e) {
  console.log('done');
};
node.connect(processor);
processor.connect(audio_context.destination);
Example at jsFiddle. As you can see, it prints done to the console as the stream is processed, but I can say nothing about the correctness of this setup (since, as I said, I have little experience), so please double-check the connections between nodes and, if necessary, adjust the first parameter (the buffer size).
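If you actually want to get at the buffer (as mentioned in the question), the handler could look something like this instead of just logging. e.inputBuffer holds the raw samples and e.outputBuffer is what gets passed on to the destination; recordedChunks is just a placeholder name for wherever you want to collect the data:
var recordedChunks = []; // placeholder: collect the audio blocks here
processor.onaudioprocess = function (e) {
  // one Float32Array of 4096 samples per channel, per callback
  var inputData = e.inputBuffer.getChannelData(0);
  var outputData = e.outputBuffer.getChannelData(0);
  for (var i = 0; i < inputData.length; i++) {
    outputData[i] = inputData[i]; // pass the audio through unchanged
  }
  // copy the block if you want to keep it after the callback returns
  recordedChunks.push(new Float32Array(inputData));
};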
Note: I'm assuming you want to do something that alters the stream (it wasn't clear in your question). If you want to do something else (for instance, just analyse it) and won't be changing the input, then you can connect the nodes like you did before (node and destination) and create the ScriptProcessorNode with one input but no outputs:
var node = input.context.createGain(); // again, createGain() takes no arguments
node.connect(audio_context.destination);

var processor = input.context.createScriptProcessor(4096, 1, 0);
processor.onaudioprocess = function (e) {
  console.log('done');
};
node.connect(processor);
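For that analysis-only variant, the handler just reads the input and produces nothing. As a rough sketch, it could compute something like the RMS level of each block:
processor.onaudioprocess = function (e) {
  var samples = e.inputBuffer.getChannelData(0);
  var sum = 0;
  for (var i = 0; i < samples.length; i++) {
    sum += samples[i] * samples[i];
  }
  console.log(Math.sqrt(sum / samples.length)); // rough loudness of this 4096-sample block
};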
You can also bypass the Web Audio event and use the audio element's own timeupdate event. It might suit your purpose if you don't need a high-resolution event (such as audioprocess).
As opposed to audioprocess, timeupdate fires only when the audio position changes: when it is actually playing or when you seek to another position. It is said to fire roughly once every 250 ms (so it's low-frequency and thus more performant).
function startUserMedia(stream) {
  // timeupdate is a media element event, so play the stream through an Audio element
  var audioEl = new Audio();
  audioEl.srcObject = stream; // in older browsers: audioEl.src = URL.createObjectURL(stream);
  audioEl.ontimeupdate = function () {
    console.log(audioEl.currentTime);
  };
  audioEl.play();
  /* … */
}
P.S.
In your example, you call console.log('done') in the onaudioprocess handler. That makes me think you misunderstand the purpose of the event: it fires continuously while audio is being processed, not once something is done with the stream.
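To make that concrete: with a buffer size of 4096 and a 44100 Hz context, the handler runs roughly every 93 ms for as long as the graph is connected. If you need a "done" point, you have to decide it yourself. Again just a sketch, reusing the processor variable from above:
var secondsCollected = 0;
processor.onaudioprocess = function (e) {
  secondsCollected += e.inputBuffer.duration; // ~0.093 s per callback at 44100 Hz
  if (secondsCollected >= 5) {
    console.log('done'); // i.e. we decided that 5 seconds is enough
    processor.disconnect(); // callbacks stop once the node is disconnected
  }
};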