 

Playing a simple sound with web audio api

I've been trying to follow the steps in some tutorials for playback of a simple, encoded local wav or mp3 file with the web Audio API using a button. My code is the following (testAudioAPI.js):

window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();
var myBuffer;

clickme = document.getElementById('clickme');
clickme.addEventListener('click',clickHandler);

var request = new XMLHttpRequest();

request.open('GET', 'WoodeBlock_SMan_B.wav', true);

request.responseType = 'arraybuffer';

// Decode asynchronously
request.onload = function() {
  context.decodeAudioData(request.response, function(theBuffer) {
    myBuffer = theBuffer;
  }, onError);
}
request.send();

function playSound(buffer) {
  var source = context.createBufferSource(), g = context.createGain();
  source.buffer = buffer;
  source.start(0);
  g.gain.value = 0.5;
  source.connect(g);
  g.connect(context.destination);
}

function clickHandler(e) {
    playSound(myBuffer);
}

And the HTML file would look like this:

    <!doctype html>
<html>
    <body>
        <button id="clickme">Play</button>
        <script src='testAudioAPI.js'></script>
    </body>
</html>

However, no sound is achieved whatsoever. I've tried several snippets but I still can't figure it out. When I try to generate a sound by synthesizing it by creating an oscillator node, I do get sound, but not with buffers from local files. What would be the problem here? Thank you all.

asked May 27 '15 by joobla

1 Answer

A minimal approach using modern ES6:

  1. Create a context with new AudioContext().
  2. Create a source with context.createBufferSource().
  3. Fetch the file as an ArrayBuffer, decode it with context.decodeAudioData(), and assign the resulting AudioBuffer to source.buffer.
  4. Start playback with source.start().
<button id="start">playSound</button>

const audioPlay = async url => {
  const context = new AudioContext();
  const source = context.createBufferSource();
  const audioBuffer = await fetch(url)
    .then(res => res.arrayBuffer())
    .then(arrayBuffer => context.decodeAudioData(arrayBuffer));

  source.buffer = audioBuffer;
  source.connect(context.destination);
  source.start();
};

document.querySelector('#start').onclick = () => audioPlay('music/music.mp3');

To stop playback, call source.stop();
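Note that an AudioBufferSourceNode is one-shot: once stopped it cannot be restarted, so you must create a fresh source for each play. A small sketch of a player wrapper that keeps a reference to the current source so a stop button can work (makePlayer and its method names are made up for illustration, not part of the Web Audio API):

```javascript
// Hypothetical wrapper: tracks the currently playing source node so it
// can be stopped later. Source nodes are one-shot, so play() always
// creates a new one.
const makePlayer = context => {
  let current = null; // the AudioBufferSourceNode currently playing, if any
  return {
    play(buffer) {
      if (current) current.stop();             // stop any previous playback
      current = context.createBufferSource();  // fresh node each time
      current.buffer = buffer;
      current.connect(context.destination);
      current.start();
    },
    stop() {
      if (current) {
        current.stop();
        current = null;
      }
    }
  };
};
```

Usage would look like: const player = makePlayer(context); player.play(audioBuffer); and later player.stop();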

The Web Audio API will not play automatically; playback must be triggered by a user gesture such as a click.
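Browsers typically create an AudioContext in the "suspended" state until a user gesture occurs, and starting a source on a suspended context produces silence. A sketch of resuming the context from inside a click handler (the helper name playOnGesture is hypothetical):

```javascript
// Hypothetical helper: resume a suspended AudioContext before playing.
// Calling resume() is allowed here because we run inside a user gesture.
const playOnGesture = async (context, play) => {
  if (context.state === 'suspended') {
    await context.resume();
  }
  play(); // only start the source once the context is running
};
```

Wired from a button, this might look like: button.onclick = () => playOnGesture(context, () => source.start());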

Creating too many AudioContext objects will cause an error; close the old context before creating a new one:

Failed to construct 'AudioContext': number of hardware contexts reached maximum

const audioPlay = (() => {
  let context = null;
  return async url => {
    if (context) context.close();
    context = new AudioContext();
    const source = context.createBufferSource();
    source.buffer = await fetch(url)
      .then(res => res.arrayBuffer())
      .then(arrayBuffer => context.decodeAudioData(arrayBuffer));
    source.connect(context.destination);
    source.start();
  };
})();

document.querySelector('#start').onclick = () => audioPlay('music/music.mp3');
answered Oct 09 '22 by weiya ou