I want to map a rectangular texture to a quad with corners in arbitrary positions corresponding to screen pixels. Something like drawQuad( x1,y1, x2,y2, x3,y3, x4,y4 ). Are there any demos of doing this with WebGL?
Everything I have done with WebGL has been with Three.js, which makes geometry projection, lighting, and cameras easy. But this is simpler: no lighting or camera angles needed, just the absolute positions of the four corners. I'm not opposed to using Three.js if it gives me access to this lower level.
I found an explanation with a GLSL vertex shader in Quartz Composer.
Update
In my project I ended up going with CSS for now, because I can stretch any element (canvas, img, div, video) and don't get the tearing that comes with non-planar shapes... Eventually I might revisit the WebGL option. If anybody has an example of dragging four corners to arbitrary positions, I'll add it here.
I strongly suggest reading the tutorials at "Learning WebGL".
For every vertex of the quad you send its UV coordinates. UV coordinates are a vec2 that tells which part of the texture is bound to that vertex.
So, for example, the UV (0.0, 0.0) represents the upper-left corner of the texture, while (0.3, 0.4) represents the position at 30% of the texture's width and 40% of its height.
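To make the mapping concrete, here is a small JavaScript sketch (the helper name uvToTexel is my own, not part of any API) that converts a UV pair into pixel coordinates on a texture of a given size:

```javascript
// Hypothetical helper: map a UV coordinate to pixel coordinates on a
// texture of the given size. (0, 0) is one corner, (1, 1) the opposite.
function uvToTexel(u, v, texWidth, texHeight) {
  return { x: u * texWidth, y: v * texHeight };
}

// (0.3, 0.4) on a 512x256 texture lands at 30% of the width
// and 40% of the height.
const p = uvToTexel(0.3, 0.4, 512, 256);
console.log(p.x, p.y); // 153.6 102.4
```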
In the GLSL fragment shader there is a built-in function, texture2D. Here's the usage:
precision mediump float;

uniform sampler2D uTexture;
varying vec2 vUV;

void main() {
    vec4 color_from_texture = texture2D( uTexture, vUV );
    gl_FragColor = color_from_texture;
}
This is called sampling the texture: you read data from the texture at a given position by calling texture2D.
So for a quad, the vertices would have positions (x1, y1), (x2, y1), (x1, y2), (x2, y2), and their corresponding UV coordinates would be (0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0). Note how we stretched the texture by spanning the UV coordinates fully from one corner to the other.
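As a sketch, those positions and UVs can be packed into typed arrays ready for gl.bufferData (the helper name buildQuad is mine, just for illustration); the vertex order matches gl.TRIANGLE_STRIP:

```javascript
// Hypothetical helper: build position and UV arrays for an axis-aligned
// quad from (x1, y1) to (x2, y2), ordered for gl.TRIANGLE_STRIP.
function buildQuad(x1, y1, x2, y2) {
  return {
    positions: new Float32Array([x1, y1,  x2, y1,  x1, y2,  x2, y2]),
    uvs:       new Float32Array([0, 0,    1, 0,    0, 1,    1, 1]),
  };
}

const quad = buildQuad(-0.5, -0.5, 0.5, 0.5);
// Upload each array with gl.bufferData(gl.ARRAY_BUFFER, ...) and draw
// with gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4).
```

For the arbitrary four-corner case from the question, you would fill the positions array with the four corner points directly instead of deriving them from two opposite corners.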
The vertex shader would look something like this:
attribute vec2 aVertexPos;
attribute vec2 aUV;

varying vec2 vUV;

void main() {
    vUV = aUV;
    gl_Position = vec4( aVertexPos, 0.0, 1.0 ); // or projectionMatrix * modelViewMatrix * vec4( aVertexPos, 0.0, 1.0 )
}
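Since the question asks for corners given in screen pixels, and gl_Position is in clip space, the pixel coordinates have to be converted first. A minimal sketch (the helper name pixelToClip is my own) assuming a top-left pixel origin:

```javascript
// Hypothetical helper: convert pixel coordinates (origin at the top-left
// of the canvas) to WebGL clip space, where x and y run from -1 to 1
// and y points up.
function pixelToClip(px, py, canvasWidth, canvasHeight) {
  return [
    (px / canvasWidth) * 2 - 1,
    1 - (py / canvasHeight) * 2,
  ];
}

console.log(pixelToClip(0, 0, 800, 600));     // [-1, 1]  (top-left)
console.log(pixelToClip(800, 600, 800, 600)); // [1, -1]  (bottom-right)
```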
The GPU interpolates the UV coordinates between the vertices that form each triangle, so every fragment gets its own interpolated UV value and thus samples different texture data.
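The interpolation is a weighted average across the triangle; along an edge it reduces to linear interpolation, which can be illustrated in plain JavaScript (the lerpUV helper is only for illustration):

```javascript
// Linear interpolation of a UV varying between two vertices:
// at t = 0 the fragment gets uvA, at t = 1 it gets uvB.
function lerpUV(uvA, uvB, t) {
  return [
    uvA[0] + (uvB[0] - uvA[0]) * t,
    uvA[1] + (uvB[1] - uvA[1]) * t,
  ];
}

// A fragment halfway between vertices with UVs (0,0) and (1,1)
// samples the middle of the texture.
console.log(lerpUV([0, 0], [1, 1], 0.5)); // [0.5, 0.5]
```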
Read this carefully, and after that creating the quad should be easy: http://learningwebgl.com/blog/?p=507
Hope this helps.
Live example: http://abstract-algorithm.com/quad.html