
Audio Array Buffer to Audio Element

I'm creating a Chrome app that decrypts MP3s sent from my PBX server to my Gmail account and plays them. I have completed everything except the audio player in Gmail. I have two options:

  1. Use the Web Audio API (I got it working, but can't figure out how to build a fully functional seek bar).
  2. Create an object URL from the array with createObjectURL and pass it to either an audio tag or SoundManager2.

I want to reuse code as much as possible and haven't been able to find a pre-made Web Audio API player with a seek bar. So I attempted option 2, and the following is as far as I got:

window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();
context.decodeAudioData(arr.buffer, function (soundBuffer) {
    windowURL = window.URL || window.webkitURL;
    var audio = document.createElement("audio");
    audio.src = windowURL.createObjectURL([soundBuffer]);
    var someDiv = document.getElementById("testDiv");
    someDiv.appendChild(audio);
    audio.onload = function (e) {
        windowURL.revokeObjectURL(this.src);
    }
}, function (err) {
   console.log("couldnt decode buffer");
});

It fails with "Failed to execute 'createObjectURL' on 'URL': No function was found that matched the signature provided." How should I properly code this so that it creates a URL that can be used by Chrome's MP3 player or SoundManager2?

asked Jun 23 '14 by SILENT


People also ask

What is an audio buffer?

An audio buffer holds a single buffer of audio data in its mData field. The buffer can represent two types of audio: A single, monophonic, noninterleaved channel of audio. Interleaved audio with the number of channels set by the mNumberChannels field.

How do I play an AudioBuffer?

The AudioBuffer interface represents a short audio asset residing in memory, created from an audio file using the AudioContext.decodeAudioData() method, or from raw data using AudioContext.createBuffer(). Once put into an AudioBuffer, the audio can then be played by being passed into an AudioBufferSourceNode.
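
A minimal sketch of that last step, assuming context and soundBuffer are the AudioContext and decoded AudioBuffer from the question's code:

// Play a decoded AudioBuffer through an AudioBufferSourceNode.
var source = context.createBufferSource(); // one-shot source node
source.buffer = soundBuffer;               // attach the decoded audio
source.connect(context.destination);       // route it to the speakers
source.start(0);                           // start playback immediately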


2 Answers

You need to create a Blob first and pass that as the argument to createObjectURL. Note that the Blob should wrap the original encoded bytes (the question's arr), not the decoded AudioBuffer:

....
// Wrap the raw MP3 bytes; an AudioBuffer cannot be placed in a Blob.
const blob = new Blob([arr], { type: "audio/mpeg" });
audio.src = window.URL.createObjectURL(blob);
....

Source
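
Put together with the question's callback, a fuller sketch might look like the following. It assumes arr is a Uint8Array (or ArrayBuffer) of the decrypted MP3 bytes, so decodeAudioData is not needed at all, and it reuses the question's own element IDs:

// Sketch: feed the decrypted MP3 bytes straight to an <audio> element.
var windowURL = window.URL || window.webkitURL;
var blob = new Blob([arr], { type: "audio/mpeg" });

var audio = document.createElement("audio");
audio.controls = true;                        // native seek bar for free
audio.src = windowURL.createObjectURL(blob);
audio.addEventListener("ended", function () {
    windowURL.revokeObjectURL(audio.src);     // release the blob URL once playback finishes
});

document.getElementById("testDiv").appendChild(audio);
audio.play();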

answered Oct 22 '22 by Mahesh


Yes, you can do this with the MediaSource API:

  1. Fetch the audio as an ArrayBuffer (there is no need to decode it with the Web Audio API).
  2. Create a MediaSource and listen for its 'sourceopen' event.
  3. Add a SourceBuffer and use its appendBuffer method to append the encoded audio bytes (see the sketch after the link below).

https://developer.mozilla.org/en-US/docs/Web/API/MediaSource
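
A minimal sketch of that flow, assuming arr holds the encoded MP3 bytes and that the browser reports support for the audio/mpeg MIME type:

// Sketch: stream the encoded MP3 bytes through MediaSource.
if (window.MediaSource && MediaSource.isTypeSupported("audio/mpeg")) {
    var mediaSource = new MediaSource();
    var audio = document.createElement("audio");
    audio.controls = true;
    audio.src = URL.createObjectURL(mediaSource);

    mediaSource.addEventListener("sourceopen", function () {
        var sourceBuffer = mediaSource.addSourceBuffer("audio/mpeg");
        sourceBuffer.addEventListener("updateend", function () {
            mediaSource.endOfStream();   // no more segments to append
        });
        sourceBuffer.appendBuffer(arr);  // append the encoded bytes, not decoded PCM
    });

    document.getElementById("testDiv").appendChild(audio);
}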

answered Oct 22 '22 by Dima Melnik