Is there a way to use the Web Audio API to sample audio faster than real-time?

I'm playing around with the Web Audio API and trying to find a way to import an mp3 (so this is Chrome-only for now) and generate a waveform of it on a canvas. I can do this in real-time, but my goal is to do it faster than real-time.

All the examples I've been able to find involve reading the frequency data from an analyser object, in a function attached to the onaudioprocess event:

processor = context.createJavaScriptNode(2048, 1, 1);
processor.onaudioprocess = processAudio;
...
function processAudio(e) {
    var freqByteData = new Uint8Array(analyser.frequencyBinCount);
    analyser.getByteFrequencyData(freqByteData);
    // calculate magnitude & render to canvas
}

It appears, though, that the analyser's frequency data (what getByteFrequencyData fills in) is only populated while the sound is actually playing (something about the buffer being filled).
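
For context, here's roughly what the full real-time wiring behind that snippet looks like (a sketch using the webkit-prefixed names from that era; buffer is assumed to be an already-decoded AudioBuffer):

var context = new webkitAudioContext();
var analyser = context.createAnalyser();
var processor = context.createJavaScriptNode(2048, 1, 1);

// buffer is assumed to come from context.decodeAudioData()
var source = context.createBufferSource();
source.buffer = buffer;

// source -> analyser -> processor -> speakers
source.connect(analyser);
analyser.connect(processor);
processor.connect(context.destination);

processor.onaudioprocess = function() {
    var freqByteData = new Uint8Array(analyser.frequencyBinCount);
    analyser.getByteFrequencyData(freqByteData);
    // draw freqByteData onto the canvas here
};

source.noteOn(0); // data only flows while the source is playing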

What I want is to be able to manually/programmatically step through the file as fast as possible, to generate the canvas image.

What I've got so far is this:

$("#files").on('change',function(e){
    var FileList = e.target.files,
        Reader = new FileReader();

    var File = FileList[0];

    Reader.onload = (function(theFile){
        return function(e){
            context.decodeAudioData(e.target.result,function(buffer){
                source.buffer = buffer;
                source.connect(analyser);
                analyser.connect(jsNode);

                var freqData = new Uint8Array(buffer.getChannelData(0));

                console.dir(analyser);
                console.dir(jsNode);

                jsNode.connect(context.destination);
                //source.noteOn(0);
            });
        };
    })(File);

    Reader.readAsArrayBuffer(File);
});

But getChannelData() always returns an empty typed array.
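
For reference, here's a rough sketch of what I'd do with the samples once I can actually get at them - getChannelData(0) should give a Float32Array of samples in the -1..1 range that I can bucket down to the canvas width (the canvas variable is a placeholder):

function drawWaveform(buffer, canvas) {
    var ctx = canvas.getContext('2d');
    var samples = buffer.getChannelData(0); // Float32Array, -1..1
    var step = Math.floor(samples.length / canvas.width);

    ctx.clearRect(0, 0, canvas.width, canvas.height);
    for (var x = 0; x < canvas.width; x++) {
        // find the min/max sample in this bucket
        var min = 1, max = -1;
        for (var i = 0; i < step; i++) {
            var s = samples[x * step + i];
            if (s < min) min = s;
            if (s > max) max = s;
        }
        // map -1..1 to canvas y coordinates and draw a 1px-wide column
        var top = (1 - max) / 2 * canvas.height;
        var bottom = (1 - min) / 2 * canvas.height;
        ctx.fillRect(x, top, 1, Math.max(1, bottom - top));
    }
}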

Any insight is appreciated - even if it turns out it can't be done. I think I'm the only one on the Internet not wanting to do stuff in real-time.

Thanks.

asked Nov 10 '11 by Quasipickle


1 Answer

There is a really amazing 'offline' mode of the Web Audio API that allows you to pre-process an entire file through an audio context and then do something with the result:

// buffer is an already-decoded AudioBuffer (e.g. from decodeAudioData).
// The offline context is sized to match it: channels, length in
// sample-frames, and sample rate.
var context = new webkitOfflineAudioContext(buffer.numberOfChannels,
                                            buffer.length,
                                            buffer.sampleRate);

var source = context.createBufferSource();
source.buffer = buffer;
source.connect(context.destination);
source.noteOn(0);

context.oncomplete = function(e) {
  // e.renderedBuffer holds the fully rendered audio
  var audioBuffer = e.renderedBuffer;
};

context.startRendering();

So the setup looks almost exactly the same as the real-time processing mode, except that you set an oncomplete callback and call startRendering(). What you get back in e.renderedBuffer is an AudioBuffer.
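
For anyone on a newer browser: the same idea works with the current unprefixed API, where startRendering() returns a promise and the buffer source uses start() instead of noteOn(). A minimal sketch (arrayBuffer is assumed to come from FileReader.readAsArrayBuffer()):

var audioCtx = new AudioContext();

audioCtx.decodeAudioData(arrayBuffer, function(decoded) {
    // size the offline context to the decoded file: channels, frames, sample rate
    var offline = new OfflineAudioContext(decoded.numberOfChannels,
                                          decoded.length,
                                          decoded.sampleRate);

    var src = offline.createBufferSource();
    src.buffer = decoded;
    src.connect(offline.destination);
    src.start(0);

    offline.startRendering().then(function(renderedBuffer) {
        // renderedBuffer.getChannelData(0) is the Float32Array of samples,
        // ready to be downsampled onto a canvas
    });
});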

answered Sep 22 '22 by ebidel