I am trying to port this (http://madebyevan.com/webgl-water/) over to THREE.js. I think I'm getting close (I just want the simulation for now; I don't care about caustics/refraction yet). I'd like to get it working with shaders so the simulation runs on the GPU.
Here's my current THREE setup using shaders: http://jsfiddle.net/EqLL9/2/ (the second smaller plane is for debugging what's currently in the WebGLRenderTarget)
What I'm struggling with is reading data back from the WebGLRenderTarget (rtTexture in my example). In the example you'll see that the four vertices surrounding the center point are displaced upwards. This is correct after one simulation step, since the simulation starts with the center point as the only displaced point.
If I could read the data back from the rtTexture and update the data texture (buf1) each frame, the simulation should animate properly. How does one read the data directly from a WebGLRenderTarget? All the examples demonstrate how to send data TO the target (render to it), not how to read FROM it. Or am I doing it all wrong? Something tells me I'll have to work with multiple textures and somehow swap back and forth, similar to how Evan did it.
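For reference, this is roughly the kind of ping-pong setup I have in mind (just a sketch, not working code; rtA, rtB, simMaterial and the heightMap uniform are placeholder names, and the render-target parameters are guesses):

// two render targets to alternate between each simulation step
var rtA = new THREE.WebGLRenderTarget( simRes, simRes, {
    minFilter: THREE.NearestFilter,
    magFilter: THREE.NearestFilter,
    format: THREE.RGBAFormat
} );
var rtB = rtA.clone();

function simStep() {
    // read the previous state from rtA, write the new state into rtB
    simMaterial.uniforms.heightMap.value = rtA; // rtA.texture in newer three.js builds
    renderer.render( sceneRTT, cameraRTT, rtB, true );
    // swap, so the next step reads what was just written
    var tmp = rtA; rtA = rtB; rtB = tmp;
}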
TL;DR: How can I copy data from a WebGLRenderTarget to a DataTexture after a call like this:
// render to rtTexture
renderer.render( sceneRTT, cameraRTT, rtTexture, true );
EDIT: I may have found the solution at jsfiddle /gero3/UyGD8/9/. I'll investigate and report back.
OK, I figured out how to read the data back using native WebGL calls:
// Render first scene into texture
renderer.render( sceneRTT, cameraRTT, rtTexture, true );
// read render texture into buffer
var gl = renderer.getContext();
gl.readPixels( 0, 0, simRes, simRes, gl.RGBA, gl.UNSIGNED_BYTE, buf1.image.data );
buf1.needsUpdate = true;
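For this to line up with the readPixels call, buf1 has to be a DataTexture whose backing array matches what readPixels writes: same resolution, unsigned-byte RGBA. Something along these lines is what I'm assuming (the exact setup is in the fiddle):

// unsigned-byte RGBA buffer matching the render target and the readPixels call above
var data = new Uint8Array( simRes * simRes * 4 );
var buf1 = new THREE.DataTexture( data, simRes, simRes, THREE.RGBAFormat, THREE.UnsignedByteType );
buf1.minFilter = THREE.NearestFilter;
buf1.magFilter = THREE.NearestFilter;
buf1.needsUpdate = true;

Note that readPixels reads from whatever framebuffer is currently bound, which (as far as I can tell) is still the render target's framebuffer immediately after the render() call, so the order of the calls above matters.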
The simulation now animates. However, it doesn't seem to be functioning properly (probably a dumb error I'm overlooking). It seems the height values are never being damped, and I'm not sure why. The data from buf1 is used in the fragment shader, which calculates the new height (the red channel of the RGBA texel), damps it (multiplies by 0.99), then renders it to the texture. I then read this updated data from the texture back into buf1.
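Stripped down, what that fragment shader does is roughly this (a simplified sketch, not the exact shader from the fiddle; heightMap and delta are placeholder uniform names):

var simFragmentShader = [
    'uniform sampler2D heightMap;', // previous state (buf1)
    'uniform float delta;',         // texel size, 1.0 / simRes
    'varying vec2 vUv;',
    'void main() {',
    '    float h = texture2D( heightMap, vUv ).r;',
    '    float average = (',
    '        texture2D( heightMap, vUv - vec2( delta, 0.0 ) ).r +',
    '        texture2D( heightMap, vUv + vec2( delta, 0.0 ) ).r +',
    '        texture2D( heightMap, vUv - vec2( 0.0, delta ) ).r +',
    '        texture2D( heightMap, vUv + vec2( 0.0, delta ) ).r ) * 0.25;',
    '    float newHeight = mix( h, average, 0.5 ) * 0.99;', // relax toward neighbours, then damp
    '    gl_FragColor = vec4( newHeight, 0.0, 0.0, 1.0 );',
    '}'
].join( '\n' );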
Here's the latest fiddle: http://jsfiddle.net/EqLL9/3/
I'll keep this updated as I progress along.
EDIT: Works great now. I just got normals implemented, and I'm now working on environment reflection and refraction (again, purely through shaders). http://relicweb.com/webgl/rt.html