For my project I record user audio using MediaRecorder, and it almost works. The problem arises when I try to display a waveform of the user's recording with Wavesurfer.js, which fails to load the recording. Playing the same recording with an Audio element works fine, though.
After trying different sources, it seems the cause is that the final .webm file has almost no metadata, not even a duration or bitrate (even though I set the bitrate in the MediaRecorder options). Here is the ffprobe output for one of the files:
Input #0, matroska,webm, from '206_3.webm':
  Metadata:
    encoder         : Chrome
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0(eng): Audio: opus, 48000 Hz, mono, fltp (default)
So my question is: am I doing something wrong when recording the audio? Here is how I set up the recording:
// Somewhere in the code...
this._handleUserMedia(await navigator.mediaDevices.getUserMedia({ audio: true }));
// ... and elsewhere
_handleUserMedia(stream) {
  this._mediaRecorder = new MediaRecorder(stream, { audioBitsPerSecond: 64000 });

  this._mediaRecorder.ondataavailable = event => {
    this._mediaBuffer.push(event.data);
  };

  this._mediaRecorder.onstop = () => {
    // Add the blob and an object URL to it to the results, for saving and playback
    let blob = new Blob(this._mediaBuffer, { type: "audio/webm" });
    this.state.results[this.state.currentWordIdx].recordingBlob = blob;
    this.state.results[this.state.currentWordIdx].recordingUrl = URL.createObjectURL(blob);

    // Reset the buffer for the next recording
    this._mediaBuffer = [];
    this._gotoNextWord();
  };

  this._gotoNextWord();
}
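(Roughly, the recorder is driven once per word like this; the method names here are illustrative, not the exact code:)

startWordRecording() {
  this._mediaBuffer = [];
  this._mediaRecorder.start();
}

stopWordRecording() {
  // stop() flushes a final dataavailable event, then fires onstop above
  this._mediaRecorder.stop();
}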
As you can see, I create a blob, which I later save with Node.js's fs.writeFile (a rough sketch of that step follows the snippet below). Then, when I need to display the waveform, I load the file using fs.readFile like this:
fs.readFile(`${this.getAppData()}/${filePath}`, (err, buffer) => {
  if (err) { reject(err); }
  const blob = new Blob([buffer], { type: 'audio/webm' });
  resolve(URL.createObjectURL(blob)); // If an ArrayBuffer is needed => toArrayBuffer(buffer)
});
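The save step itself is just a Blob-to-Buffer conversion before fs.writeFile; roughly like this (savePath is a placeholder for wherever the file goes):

// Convert the Blob to a Node.js Buffer, then write it to disk
blob.arrayBuffer().then(arrayBuffer => {
  fs.writeFile(savePath, Buffer.from(arrayBuffer), err => {
    if (err) { console.error("Failed to save recording:", err); }
  });
});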
I believe the reason there is so little metadata is that, by default, MediaRecorder produces a variable-bitrate file, for which a single bitrate value is not meaningful; presumably (although I'm not sure) that also leads to the lack of a clear duration value.
The spec has recently added support for setting a constant bitrate for recording, with an implementation soon to land in Chromium (M89).
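If I understand the change correctly, the new option is the audioBitrateMode member of MediaRecorderOptions; a minimal sketch, assuming a browser that implements it:

const recorder = new MediaRecorder(stream, {
  audioBitsPerSecond: 64000,
  audioBitrateMode: "constant" // spec values: "constant" or "variable" (the default)
});

Since MediaRecorderOptions is a WebIDL dictionary, browsers that don't know the option simply ignore it, so it should be safe to pass unconditionally.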