I'd like to take a WebGLTexture object that I've rendered into and use it to create an HTML image element. The goal is to display the result of an offscreen rendering pass for debugging purposes. It should be a lot easier than rendering the texture to a full screen quad, my current debugging method.
Creating a texture from an image element is really easy in WebGL:
var image = new Image();
image.onload = function () {
    var texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
};
image.src = "myImg.jpg";
Image loading and decoding is completely taken care of for you.
Is there a similarly easy way to do the reverse? i.e.:
// This doesn't work
var img = new Image(texture);
// But maybe this could
var img = createImageFromTexture(texture);
function createImageFromTexture(texture) {
// ... some combination of tricks ...
}
If there is a way to do this, I'm sure it will be useful in contexts outside of debugging. I'll continue to see if I can find a way to do it, but I feel like someone has had to have attempted this before.
You can create a framebuffer backed by a texture and then read the raw pixel data out of the framebuffer using gl.readPixels(). Once you have the pixel data, you can copy it to a 2D canvas using ImageData. Then you can construct an Image by setting the image's src property to canvas.toDataURL().
function createImageFromTexture(gl, texture, width, height) {
    // Create a framebuffer backed by the texture
    var framebuffer = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);

    // Read the contents of the framebuffer
    var data = new Uint8Array(width * height * 4);
    gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, data);

    // Unbind and clean up the framebuffer
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
    gl.deleteFramebuffer(framebuffer);

    // Create a 2D canvas to store the result
    var canvas = document.createElement('canvas');
    canvas.width = width;
    canvas.height = height;
    var context = canvas.getContext('2d');

    // Copy the pixels to the 2D canvas
    var imageData = context.createImageData(width, height);
    imageData.data.set(data);
    context.putImageData(imageData, 0, 0);

    var img = new Image();
    img.src = canvas.toDataURL();
    return img;
}
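One caveat: gl.readPixels fills the buffer starting at the bottom-left corner, while ImageData (and hence putImageData) expects the top row first, so the snapshot above will come out vertically flipped. A small sketch of a helper that reorders the rows first (the name flipPixelsVertically is mine):

```javascript
// gl.readPixels returns rows bottom-to-top; ImageData wants top-to-bottom.
// Build a new buffer with the rows in reversed order.
function flipPixelsVertically(data, width, height) {
    var rowBytes = width * 4; // RGBA: 4 bytes per pixel
    var flipped = new Uint8Array(data.length);
    for (var y = 0; y < height; y++) {
        // Copy source row y into destination row (height - 1 - y)
        flipped.set(data.subarray(y * rowBytes, (y + 1) * rowBytes),
                    (height - 1 - y) * rowBytes);
    }
    return flipped;
}
```

With this in place, replace imageData.data.set(data) with imageData.data.set(flipPixelsVertically(data, width, height)).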
Here is an example of running a node.js server in conjunction with toDataURL() to save multiple rendered frames directly to disk:
http://www.oampo.co.uk/2011/01/exporting-video-from-webgl/
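That approach has the browser POST the canvas.toDataURL() string to a local node.js server, which decodes it back into raw bytes before writing to disk. As a rough sketch of that decoding step (the helper name, and the assumption that the client sends a base64 PNG data URL, are mine):

```javascript
// Strip the "data:image/png;base64," prefix and decode the payload
// into a Buffer that can be written straight to disk.
function dataURLToBuffer(dataURL) {
    var match = /^data:image\/png;base64,(.*)$/.exec(dataURL);
    if (!match) {
        throw new Error('expected a base64-encoded PNG data URL');
    }
    return Buffer.from(match[1], 'base64');
}

// e.g. inside an HTTP request handler:
// require('fs').writeFileSync('frame-0001.png', dataURLToBuffer(body));
```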