
WebGL Reading pixels properly from offscreen canvas

I know there are a lot of resources on this, but none of them have worked for me.
Some are: webgl readpixels is always returning 0,0,0,0,
this one: https://stackoverflow.com/questions/44869599/readpixels-from-webgl-canvas,
and this one: Read pixels from a WebGL texture, but none of them has been helpful.

The goal: Render an offscreen canvas with a WebGL shader, then use that as a texture in a separate WebGL shader.

Notes:

  • For these WebGL shaders, I'm using a generic vertex shader meant for pixel shaders (specifically a raytracer/raymarcher): attribute vec2 a_position; void main() { gl_Position = vec4(a_position.xy, 0.0, 1.0); }. The vertex shader is fed two triangles that cover the screen, so the fragment shader does essentially all the work (see the sketch below).
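
For reference, here is a minimal sketch of that kind of full-screen-quad setup (the gl and program variables are placeholders, not my actual code, which is linked below):

// Two triangles covering clip space; the fragment shader does the real work.
var positions = new Float32Array([
  -1, -1,   1, -1,  -1,  1,   // first triangle
  -1,  1,   1, -1,   1,  1,   // second triangle
]);
var buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);

var a_position = gl.getAttribLocation(program, "a_position");
gl.enableVertexAttribArray(a_position);
gl.vertexAttribPointer(a_position, 2, gl.FLOAT, false, 0, 0);
gl.drawArrays(gl.TRIANGLES, 0, 6);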

Problem: In order to get the image data off of the offscreen canvas, I've tried these methods:

  1. The WebGL gl.readPixels function
var capturedImageData = new Float32Array(screenWidth * screenHeight * 4);
gl.readPixels(0, 0, screenWidth, screenHeight, gl.RGBA, gl.FLOAT, capturedImageData);
  2. Using another canvas with getContext('2d')
var offscreenCanvas = document.createElement("canvas");
offscreenCanvas.width = screenWidth;
offscreenCanvas.height = screenHeight;
var ctx = offscreenCanvas.getContext('2d');
ctx.drawImage(glCanvas, 0, 0);
var capturedImageData = ctx.getImageData(0, 0, screenWidth, screenHeight);

Both of these methods result in the capturedImageData array being filled with
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, . . . , which is obviously not right.

If anybody has experience with this problem, help would be greatly appreciated. Thanks!

A direct link to the program is: https://www.khanacademy.org/cs/-/6289631977619456

asked Oct 26 '20 by xacer

2 Answers

You should always look in the JavaScript console of your browser (press F12 or open it from the menu). It should have shown an error for your code:

var capturedImageData = new Float32Array(screenWidth * screenHeight * 4);
gl.readPixels(0, 0, screenWidth, screenHeight, gl.RGBA, gl.FLOAT, capturedImageData);

const gl = document.createElement('canvas').getContext('webgl');
const screenWidth = 300;
const screenHeight = 150;
var capturedImageData = new Float32Array(screenWidth * screenHeight * 4);
gl.readPixels(0, 0, screenWidth, screenHeight, gl.RGBA, gl.FLOAT, capturedImageData);

When I run your code I get this error

js:16 WebGL: INVALID_ENUM: readPixels: invalid type

You can't read with FLOAT from the canvas's default 8-bit framebuffer. See this answer
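
If you want to check what readPixels will actually accept, RGBA/UNSIGNED_BYTE is always allowed for the currently bound framebuffer, and the one extra implementation-defined format/type pair can be queried. A minimal sketch, assuming gl is the offscreen canvas's WebGL context:

// RGBA/UNSIGNED_BYTE always works for readPixels; the only other allowed
// combination is implementation-defined and can be queried like this.
var readFormat = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT);
var readType = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);
console.log(readFormat === gl.RGBA, readType === gl.UNSIGNED_BYTE);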

But what you're trying to do, render to one texture and use it as input to another shader, should not be done with readPixels at all. It should be done with a framebuffer, which lets you render directly to a texture.

See this or maybe one of these
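
A minimal sketch of the framebuffer approach (firstProgram, secondProgram, and drawFullScreenQuad are placeholders for your own two shader programs and draw call):

// Create a texture to render the first pass into.
var targetTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, targetTexture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, screenWidth, screenHeight, 0,
              gl.RGBA, gl.UNSIGNED_BYTE, null);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

// Attach the texture to a framebuffer.
var fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                        gl.TEXTURE_2D, targetTexture, 0);

// Pass 1: render the first shader into the texture.
gl.useProgram(firstProgram);
gl.viewport(0, 0, screenWidth, screenHeight);
drawFullScreenQuad();

// Pass 2: render to the canvas, sampling the texture in the second shader.
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
gl.useProgram(secondProgram);
gl.bindTexture(gl.TEXTURE_2D, targetTexture); // read via a sampler2D uniform
drawFullScreenQuad();

The second shader then samples the first pass's output directly on the GPU, with no readPixels round trip through JavaScript.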

answered Oct 23 '22 by gman


gl.readPixels works here with a Uint8Array because the canvas's default framebuffer is 8-bit RGBA, so the read type has to be gl.UNSIGNED_BYTE.

var capturedImageData = new Uint8Array(screenWidth * screenHeight * 4);
gl.readPixels(0, 0, screenWidth, screenHeight, gl.RGBA, gl.UNSIGNED_BYTE, capturedImageData);

Here is a demo:

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
  </head>
  <body>
    <canvas></canvas>
    <script>
      var gl = document.querySelector("canvas").getContext("webgl");

      // Clear the canvas to magenta so there is something to read back.
      gl.clearColor(1, 0, 1, 1);
      gl.clear(gl.COLOR_BUFFER_BIT);

      // Read the drawing buffer back as 8-bit RGBA values.
      var width = gl.drawingBufferWidth;
      var height = gl.drawingBufferHeight;
      var pixels = new Uint8Array(4 * width * height);
      gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

      console.log(pixels[0]); // 255 (red channel of the bottom-left pixel)
    </script>
  </body>
</html>

answered Oct 23 '22 by Ollie