I want to record a video from an HTML <canvas> element at a specific frame rate.

I am using a CanvasCaptureMediaStream obtained with canvas.captureStream(fps), and I also have access to the video track via const track = stream.getVideoTracks()[0], so I call track.requestFrame() to write a frame to the output video buffer via MediaRecorder.

I want to precisely capture one frame at a time and then change the canvas content. Changing the canvas content can take some time (images need to be loaded, etc.), so I cannot capture the canvas in real time. Some changes on the canvas would take around 500 ms of real time, and those also need to be rendered as a single frame in the output.
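For reference, a simplified version of what I have so far (actual drawing and image loading omitted):

const canvas = document.querySelector('canvas');
const stream = canvas.captureStream(30); // CanvasCaptureMediaStream
const track = stream.getVideoTracks()[0]; // CanvasCaptureMediaStreamTrack

const recorder = new MediaRecorder(stream);
const chunks = [];
recorder.ondataavailable = (evt) => chunks.push(evt.data);
recorder.start();

// after every change of the canvas content:
track.requestFrame();

// when everything has been drawn:
recorder.stop();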
The MediaRecorder API is meant to record live streams; editing is not what it was designed for, and to be honest it doesn't do it very well...
The MediaRecorder itself has no concept of frame rate; this is normally defined by the MediaStreamTrack. However, the CanvasCaptureMediaStreamTrack doesn't really make it clear what its frame rate is.
We can pass a parameter to HTMLCanvasElement.captureStream(), but this only sets the maximum number of frames we want per second; it's not really an fps parameter.
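For instance, a call like the one below only promises at most 30 captures per second; frames are only produced when the canvas is actually repainted (a small illustration, not part of the original demo):

const stream = canvas.captureStream(30); // at most 30 frames per second
// a frame is only captured when the canvas content changes,
// so fewer (or no) frames may actually be produced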
Also, even if we stop drawing on the canvas, the recorder will still keep extending the duration of the recorded video in real time (though I think technically only a single long frame is recorded in this case).
So... we're gonna have to hack around...
One thing we can do with the MediaRecorder is to pause() and resume() it.
So it sounds quite easy: pause before doing the long drawing operation and resume right after it's done? Yes... and not that easy either...
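In pseudo-code, the tempting approach would look something like this (a naive sketch, with the_long_drawing_task as a placeholder):

recorder.pause();
await the_long_drawing_task; // not recorded, since the recorder is paused
recorder.resume();
// ...and somehow record exactly one frame's worth of time before pausing again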
Once again, the frame-rate is dictated by the MediaStreamTrack, but this MediaStreamTrack can not be paused.
Well, actually there is one way to pause a special kind of MediaStreamTrack, and luckily I'm talking about CanvasCaptureMediaStreamTracks.
When we call captureStream() with a parameter of 0, we basically get manual control over when new frames are added to the stream. So here we can synchronize both our MediaRecorder and our MediaStreamTrack to whatever frame rate we want.
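Concretely, the manual mode looks like this (a minimal sketch, assuming a canvas and its 2D context named canvas and ctx):

const stream = canvas.captureStream(0); // 0 = no automatic capture at all
const track = stream.getVideoTracks()[0];

// nothing is pushed to the stream until we explicitly ask for it
ctx.fillRect(0, 0, 50, 50);
track.requestFrame(); // push the current canvas state as one frame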
The basic workflow is:
await the_long_drawing_task;
resumeTheRecorder();
writeTheFrameToStream(); // track.requestFrame();
await wait( time_per_frame );
pauseTheRecorder();
Doing so, the recorder is awake only for the time per frame we decided, and a single frame is passed to the MediaStream during that time, effectively mocking constant-FPS drawing as far as the MediaRecorder is concerned.
But as always, hacks in this still-experimental area come with a lot of browser weirdness, and the following demo actually only works in current Chrome...
For whatever reason, Firefox will always generate files with twice the number of frames that were requested, and it will also occasionally prepend a long first frame...
Also to be noted: Chrome has a bug where it will update the canvas stream at drawing time, even though we initiated this stream with a frameRequestRate of 0. This means that if you start drawing before everything is ready, or if the drawing on your canvas itself takes a long time, our recorder will record half-baked frames that we didn't ask for.
To work around this bug, we need to use a second canvas, used only for the streaming. All we'll do on that canvas is drawImage the source one, which will always be a fast enough operation to avoid that bug.
class FrameByFrameCanvasRecorder {
  constructor(source_canvas, FPS = 30) {
    this.FPS = FPS;
    this.source = source_canvas;

    const canvas = this.canvas = source_canvas.cloneNode();
    const ctx = this.drawingContext = canvas.getContext('2d');
    // we need to draw something on our canvas
    ctx.drawImage(source_canvas, 0, 0);

    const stream = this.stream = canvas.captureStream(0);
    const track = this.track = stream.getVideoTracks()[0];

    // Firefox still uses a non-standard CanvasCaptureMediaStream
    // instead of CanvasCaptureMediaStreamTrack
    if (!track.requestFrame) {
      track.requestFrame = () => stream.requestFrame();
    }

    // prepare our MediaRecorder
    const rec = this.recorder = new MediaRecorder(stream);
    const chunks = this.chunks = [];
    rec.ondataavailable = (evt) => chunks.push(evt.data);
    rec.start();

    // we need to be in 'paused' state
    waitForEvent(rec, 'start')
      .then((evt) => rec.pause());

    // expose a Promise for when it's done
    this._init = waitForEvent(rec, 'pause');
  }
  async recordFrame() {
    await this._init; // we have to wait for the recorder to be paused

    const rec = this.recorder;
    const canvas = this.canvas;
    const source = this.source;
    const ctx = this.drawingContext;

    if (canvas.width !== source.width ||
        canvas.height !== source.height) {
      canvas.width = source.width;
      canvas.height = source.height;
    }

    // start our timer now so whatever happens in between is not taken into account
    const timer = wait(1000 / this.FPS);

    // wake up the recorder
    rec.resume();
    await waitForEvent(rec, 'resume');

    // draw the current state of source on our internal canvas (triggers requestFrame in Chrome)
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.drawImage(source, 0, 0);
    // force write the frame
    this.track.requestFrame();

    // wait until our frame-time elapsed
    await timer;

    // put the recorder back to sleep
    rec.pause();
    await waitForEvent(rec, 'pause');
  }
  async export() {
    this.recorder.stop();
    this.stream.getTracks().forEach((track) => track.stop());
    await waitForEvent(this.recorder, "stop");
    return new Blob(this.chunks);
  }
}
///////////////////
// how to use:
(async () => {
  const FPS = 30;
  const duration = 5; // seconds

  let x = 0;
  let frame = 0;

  const ctx = canvas.getContext('2d');
  ctx.textAlign = 'right';

  draw(); // we must have drawn on our canvas context before creating the recorder
  const recorder = new FrameByFrameCanvasRecorder(canvas, FPS);

  // draw one frame at a time
  while (frame++ < FPS * duration) {
    await longDraw(); // do the long drawing
    await recorder.recordFrame(); // record at constant FPS
  }
  // now all the frames have been drawn
  const recorded = await recorder.export(); // we can get our final video file
  vid.src = URL.createObjectURL(recorded);
  vid.onloadedmetadata = (evt) => vid.currentTime = 1e100; // workaround https://crbug.com/642012
  download(vid.src, 'movie.webm');

  // Fake long drawing operations that make real-time recording impossible
  function longDraw() {
    x = (x + 1) % canvas.width;
    draw(); // this triggers a bug in Chrome
    return wait(Math.random() * 300)
      .then(draw);
  }
  function draw() {
    ctx.fillStyle = 'white';
    ctx.fillRect(0, 0, canvas.width, canvas.height);
    ctx.fillStyle = 'black';
    ctx.fillRect(x, 0, 50, 50);
    ctx.fillText(frame + " / " + FPS * duration, 290, 140);
  }
})().catch(console.error);
<canvas id="canvas"></canvas>
<video id="vid" controls></video>
<script>
  // Some helpers

  // Promise based timer
  function wait(ms) {
    return new Promise(res => setTimeout(res, ms));
  }

  // implements a sub-optimal monkey-patch for requestPostAnimationFrame
  // see https://stackoverflow.com/a/57549862/3702797 for details
  if (!window.requestPostAnimationFrame) {
    window.requestPostAnimationFrame = function monkey(fn) {
      const channel = new MessageChannel();
      channel.port2.onmessage = evt => fn(evt.data);
      requestAnimationFrame((t) => channel.port1.postMessage(t));
    };
  }

  // Promisifies EventTarget.addEventListener
  function waitForEvent(target, type) {
    return new Promise((res) => target.addEventListener(type, res, {
      once: true
    }));
  }

  // creates a downloadable anchor from url
  function download(url, filename = "file.ext") {
    const a = document.createElement('a');
    a.textContent = a.download = filename;
    a.href = url;
    document.body.append(a);
    return a;
  }
</script>
I asked a similar question which has been linked to this one. In the meantime I came up with a solution which overlaps Kaiido's and which I think is worth reading.
I added two tricks: drawing something on the canvas before calling captureStream() so that Firefox initialises it, and invoking the continuation through a zero-delay setTimeout after pausing, which helps Firefox record the right frame duration.
const recordFrames = (onstop, canvas, fps = 30) => {
  const chunks = [];

  // get Firefox to initialise the canvas
  canvas.getContext('2d').fillRect(0, 0, 0, 0);

  const stream = canvas.captureStream();
  const recorder = new MediaRecorder(stream);
  recorder.addEventListener('dataavailable', ({ data }) => chunks.push(data));
  recorder.addEventListener('stop', () => onstop(new Blob(chunks)));

  const frameDuration = 1000 / fps;
  const frame = (next, start) => {
    recorder.pause();
    api.error += Date.now() - start - frameDuration;
    setTimeout(next, 0); // helps Firefox record the right frame duration
  };

  const api = {
    error: 0,
    init() {
      recorder.start();
      recorder.pause();
    },
    step(next) {
      recorder.resume();
      setTimeout(frame, frameDuration, next, Date.now());
    },
    stop: () => recorder.stop()
  };
  return api;
};
How to use:
const fps = 30;
const duration = 5000;
const animation = Something;

const videoOutput = blob => {
  const video = document.createElement('video');
  video.src = URL.createObjectURL(blob);
  document.body.appendChild(video);
};

const recording = recordFrames(videoOutput, canvas, fps);

const startRecording = () => {
  recording.init();
  animation.play();
};

// I am assuming you can call these from your library
const onAnimationRender = nextFrame => recording.step(nextFrame);
const onAnimationEnd = () => recording.step(recording.stop);

let now = 0;
const progression = () => {
  now = now + 1 + recording.error * fps / 1000;
  recording.error = 0;
  return now * 1000 / fps / duration;
};
I found this solution to be satisfying at 30 fps in both Chrome and Firefox. I didn't experience the Chrome bugs mentioned by Kaiido and thus didn't implement anything to deal with them.