 

Callback event for getUserMedia()

I'm trying to take a snapshot from my webcam with the navigator.mediaDevices.getUserMedia() and canvas.getContext('2d').drawImage() functions.

When I do it like this, it works perfectly:

function init(){
  myVideo = document.getElementById("myVideo") 
  myCanvas = document.getElementById("myCanvas");
  videoWidth = myCanvas.width;
  videoHeight = myCanvas.height;
    
  startVideoStream();
}

function startVideoStream(){
  navigator.mediaDevices.getUserMedia({audio: false, video: { width: videoWidth, height: videoHeight }}).then(function(stream) {
    myVideo.src = URL.createObjectURL(stream);
  }).catch(function(err) {
      console.log("Unable to get video stream: " + err);
  });
}

function snapshot(){
  myCanvas.getContext('2d').drawImage(myVideo, 0, 0, videoWidth, videoHeight);
}
<!DOCTYPE html>
<html>
<head>
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <script src="debug.js"></script>
</head>
<body onload="init()">
    <div id="mainContainer">
        <video id="myVideo" width="640" height="480" autoplay style="display: inline;"></video>
        <canvas id="myCanvas" width="640" height="480" style="display: inline;"></canvas>
        <input type="button" id="snapshotButton" value="Snapshot" onclick="snapshot()"/>
    </div>
</body>
</html>

The thing is, I don't want to take the snapshot on a button click, but as soon as the camera stream is loaded. I tried calling the snapshot() function directly after setting the video source:

function init(){
  myVideo = document.getElementById("myVideo") 
  myCanvas = document.getElementById("myCanvas");
  videoWidth = myCanvas.width;
  videoHeight = myCanvas.height;
    
  startVideoStream();
}

function startVideoStream(){
  navigator.mediaDevices.getUserMedia({audio: false, video: { width: videoWidth, height: videoHeight }}).then(function(stream) {
    myVideo.src = URL.createObjectURL(stream);
    snapshot();
  }).catch(function(err) {
      console.log("Unable to get video stream: " + err);
  });
}

function snapshot(){
  myCanvas.getContext('2d').drawImage(myVideo, 0, 0, videoWidth, videoHeight);
}

But it doesn't work: my canvas stays white. I guess that's because the camera stream isn't fully loaded at this point.

So is there another event that fires which I could use to draw the snapshot as soon as the camera feed is loaded? Or am I totally on the wrong track?

Thanks in advance!

asked Jan 25 '17 by xcess

1 Answer

Wait for the loadedmetadata event:

navigator.mediaDevices.getUserMedia({video: true})
  .then(stream => {
    video.srcObject = stream;
    // Resolve once metadata (dimensions) is available and frames can be drawn.
    return new Promise(resolve => video.onloadedmetadata = resolve);
  })
  .then(() => canvas.getContext('2d').drawImage(video, 0, 0, 160, 120))
  .catch(e => console.log(e));
<video id="video" width="160" height="120" autoplay></video>
<canvas id="canvas" width="160" height="120"></canvas>

The above should work in all browsers that support WebRTC.

In Chrome you can also wait for the promise returned by play() instead - but play() doesn't return a promise in any other browser yet.
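
A minimal sketch of that Chrome-only variant, assuming the same video and canvas elements as above:

navigator.mediaDevices.getUserMedia({video: true})
  .then(stream => {
    video.srcObject = stream;
    // In Chrome, play() returns a promise that resolves once playback has started.
    return video.play();
  })
  .then(() => canvas.getContext('2d').drawImage(video, 0, 0, 160, 120))
  .catch(e => console.log(e));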

Also note that URL.createObjectURL(stream) is deprecated. Use srcObject.
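
If you still have to support a browser without srcObject, a minimal sketch of a feature-detected fallback looks like this:

if ('srcObject' in video) {
  video.srcObject = stream;
} else {
  // Legacy fallback for browsers that predate srcObject.
  video.src = URL.createObjectURL(stream);
}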

Update: Thanks to @KyleMcDonald in the comments for pointing out the importance of registering the loadedmetadata listener synchronously with setting the srcObject. Code updated.
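
For completeness, an equivalent async/await sketch of the same pattern (snapshotWhenReady is just an illustrative name); the onloadedmetadata handler is attached in the same synchronous block as the srcObject assignment, so the event cannot fire before the listener exists:

async function snapshotWhenReady() {
  const stream = await navigator.mediaDevices.getUserMedia({video: true});
  video.srcObject = stream;
  // No await between the assignment above and this line, so the listener
  // is registered before loadedmetadata can possibly fire.
  await new Promise(resolve => video.onloadedmetadata = resolve);
  canvas.getContext('2d').drawImage(video, 0, 0, 160, 120);
}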

answered Sep 30 '22 by jib