I brought this up in my last post, but since it was off topic from the original question I'm posting it separately. I'm having trouble getting my transmitted audio to play back through Web Audio the same way it sounds in a media player. I have tried two different transmission protocols, BinaryJS and Socket.IO, and neither makes a difference when playing through Web Audio. To rule out transport of the audio data as the issue, I created an example that sends the data back to the server after it's received from the client and dumps the returned stream to stdout. Piping that into VLC produces exactly the listening experience you would expect.
To hear the (correct-sounding) results through VLC, run the example at https://github.com/grkblood13/web-audio-stream/tree/master/vlc with the following command:
$ node webaudio_vlc_svr.js | vlc -
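The loopback server boils down to something like this (a simplified, untested sketch assuming the BinaryJS API; the real version is in the repo linked above):

var BinaryServer = require('binaryjs').BinaryServer;
var server = new BinaryServer({ port: 9000 });

server.on('connection', function(client) {
    // The browser echoes each received audio chunk back on a stream;
    // writing that stream to stdout lets me pipe the raw audio into VLC.
    client.on('stream', function(stream, meta) {
        stream.pipe(process.stdout);
    });
});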
For whatever reason, though, when I try to play this same audio data through Web Audio it fails miserably: the result is random noise with large gaps of silence in between.
What is wrong with the following code that is making the playback sound so bad?
window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();
var delayTime = 0;
var init = 0;
var audioStack = [];

client.on('stream', function(stream, meta) {
    stream.on('data', function(data) {
        context.decodeAudioData(data, function(buffer) {
            audioStack.push(buffer);
            if (audioStack.length > 10 && init == 0) { init++; playBuffer(); }
        }, function(err) {
            console.log("err(decodeAudioData): " + err);
        });
    });
});

function playBuffer() {
    var buffer = audioStack.shift();
    setTimeout(function() {
        var source = context.createBufferSource();
        source.buffer = buffer;
        source.connect(context.destination);
        source.start(context.currentTime);
        delayTime = source.buffer.duration * 1000; // make the next buffer wait the length of the last buffer before being played
        playBuffer();
    }, delayTime);
}
Full source: https://github.com/grkblood13/web-audio-stream/tree/master/binaryjs
You really can't just call source.start(context.currentTime) like that.
setTimeout() has a long and imprecise latency - other main-thread work can be going on, so your setTimeout() calls can be delayed by milliseconds, even tens of milliseconds (by garbage collection, JS execution, layout...). Your code is trying to play audio immediately - which needs to start within about 0.02ms accuracy to avoid glitching - on a timer with tens of milliseconds of imprecision.
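You can see this jitter for yourself with a trivial check (an untested sketch; the point is just to compare when a timer should fire against when it actually does):

var expected = performance.now() + 50; // when a 50ms timer "should" fire
setTimeout(function() {
    // On a busy main thread this lateness is routinely several milliseconds.
    console.log('fired ' + (performance.now() - expected).toFixed(2) + 'ms late');
}, 50);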
The whole point of the web audio system is that the audio scheduler works in a separate high-priority thread, and you can pre-schedule audio (starts, stops, and AudioParam changes) with very high accuracy. You should rewrite your system to:
1) track when the first block was scheduled in AudioContext time - and DON'T schedule the first block immediately; give yourself some latency so your network can hopefully keep up.
2) schedule each successive block, as it arrives, at the time the previous block ends.
e.g. (note I haven't tested this code, this is off the top of my head):
window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();
var init = 0;
var audioStack = [];
var nextTime = 0;

client.on('stream', function(stream, meta) {
    stream.on('data', function(data) {
        context.decodeAudioData(data, function(buffer) {
            audioStack.push(buffer);
            if ((init != 0) || (audioStack.length > 10)) { // make sure we put at least 10 chunks in the buffer before starting
                init++;
                scheduleBuffers();
            }
        }, function(err) {
            console.log("err(decodeAudioData): " + err);
        });
    });
});

function scheduleBuffers() {
    while (audioStack.length) {
        var buffer = audioStack.shift();
        var source = context.createBufferSource();
        source.buffer = buffer;
        source.connect(context.destination);
        if (nextTime == 0)
            nextTime = context.currentTime + 0.05; // add 50ms latency to work well across systems - tune this if you like
        source.start(nextTime);
        nextTime += source.buffer.duration; // make the next buffer wait the length of the last buffer before being played
    }
}
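The same pre-scheduling works for the AudioParam changes mentioned above. As a further (equally untested) sketch, a sample-accurate fade-out needs no timers at all:

var gainNode = context.createGain();
gainNode.connect(context.destination);
// Schedule a 2-second fade-out ahead of time; the audio thread executes it
// sample-accurately, with no setTimeout involved.
gainNode.gain.setValueAtTime(1.0, context.currentTime);
gainNode.gain.linearRampToValueAtTime(0.0, context.currentTime + 2.0);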