How do I use texSubImage2D to show sprites in WebGL?

I can display my entire sprite sheet (32x512) successfully with this call to gl.texImage2D:

gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);

It's squished horizontally, like I expected, but it renders on the screen at least. I'm trying to only display the first 32x32 sprite in the sheet and I assumed that I could simply use gl.texSubImage2D to achieve this effect. I tried a simple replacement of texImage2D with texSubImage2D (with modified parameters) but I just get a black box on the screen. Here's the code I'm using:

gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0, 32, 32, gl.RGBA, gl.UNSIGNED_BYTE, image);

Am I missing something about the implementation of texSubImage2D? Is there some other step I have to do? Or is texSubImage2D not the right way to do sprite sheets?

asked Mar 15 '11 by Nick

1 Answer

texSubImage2D is not the function you want. You're running into three problems:

  1. texSubImage2D does not copy a subset of image into the GL texture; it copies the entirety of image on top of the GL texture at a given offset.
  2. texSubImage2D can only modify existing texture data, and will fail unless texImage2D has already been called for the GL texture (see the sketch after this list).
  3. The calling style you're using for texSubImage2D expects a pixel array, not an HTMLImageElement.
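
For reference, here's a rough sketch of how texSubImage2D is actually meant to be used (assuming a texture object already exists); note that it still uploads the entire image, just at an offset into storage that was allocated beforehand:

gl.bindTexture(gl.TEXTURE_2D, texture);

// Allocate storage for the full 32x512 sheet first (the data can be null)...
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 32, 512, 0,
  gl.RGBA, gl.UNSIGNED_BYTE, null);

// ...then paste the *entire* image into it at offset (0, 0). This overload
// has no width/height parameters, so it cannot crop out a single sprite.
gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0,
  gl.RGBA, gl.UNSIGNED_BYTE, image);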

There are four possible signatures for these methods:

// These two accept the normal HTMLImageElement, etc. for the last param.

texImage2D(enum target, int level,  enum internalformat, enum format,
  enum type, Object object);
texSubImage2D(enum target, int level, int xoffset, int yoffset,
  enum format, enum type, Object object);

// These two accept a Uint8Array of pixels as the last parameter, despite
// being documented as wanting an ImageData object. The only reason these
// have width/height params is *because* they take a raw pixel array, and GL
// doesn't otherwise know how large the image is.

texImage2D(enum target, int level, enum internalformat, long width,
  long height, int border, enum format, enum type, Object pixels);
texSubImage2D(enum target, int level, int xoffset, int yoffset,
  long width, long height, enum format, enum type, Object pixels);
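
As a quick illustration of the second pair (a sketch, not from the question), uploading raw pixels looks like this; the explicit width/height are what tell GL how big the buffer is:

// A solid 2x2 red RGBA texture from a raw Uint8Array.
var pixels = new Uint8Array([
  255, 0, 0, 255,   255, 0, 0, 255,
  255, 0, 0, 255,   255, 0, 0, 255
]);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 2, 2, 0,
  gl.RGBA, gl.UNSIGNED_BYTE, pixels);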

To create a texture using the first 32x32 pixels of your image, do something like this instead:

var spriteCanvas = document.createElement('canvas');
spriteCanvas.width = 32;
spriteCanvas.height = 32;

// Drawing the 32x512 sheet onto a 32x32 canvas clips it to the first sprite.
var spriteContext = spriteCanvas.getContext('2d');
spriteContext.drawImage(image, 0, 0);

// A canvas is a valid source for texImage2D, so upload it directly.
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, spriteCanvas);
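
If you later need a sprite other than the first one, the same canvas trick should work with the 9-argument form of drawImage; here spriteIndex and the vertical 32x32 layout are just assumptions:

// Copy one 32x32 cell out of the sheet onto the canvas; sprites are
// assumed to be stacked vertically, so cell n starts at y = n * 32.
var spriteIndex = 3; // hypothetical: which sprite to show
spriteContext.drawImage(image,
  0, spriteIndex * 32, 32, 32,  // source rect in the sheet
  0, 0, 32, 32);                // destination rect on the canvas
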
answered Oct 18 '22 by Nathan Ostgard