I would like to build a realistic view of the Earth from low orbit (here ~300 km) with WebGL. That is to say, on the web, with everything that implies, and moreover on mobile. Do not stop reading here: to make this a little less difficult, the user can look in any direction but not pan, so the view only concerns a small, 3000 km-wide area. But the view follows a satellite, so a few minutes later the user comes back over where they were before, with the slight shift of the Earth's rotation, and so on. The clouds therefore cannot be in the same place all the time.
I have already been able to include city lights, auroras, lightning... everything except clouds. I have seen a lot of demos by real-time rendering enthusiasts and researchers, but none of them had a nice, realistic cloud layer. However, I am sure I am the 100(...)00th person thinking about doing this, so please enlighten me.
A few questions are implied:
Here are some ideas summing up what I have not seen yet, sorted by importance:
Clouds hide 60% of the Earth.
Clouds scatter the light of cities and lightning, and show Rayleigh scattering at night.
At this distance the parallax effect is visible and even quite awesome with the smallest clouds.
As far as I have seen, even expensive real-time meteorological online resources are not useful: they target rainy or stormy clouds using UV and IR wavelengths, so they do not capture 100% of them and do not give the 'normal' view we all know. Moreover, the rare good cloud textures shot in visible light hardly differentiate ground from clouds: sometimes a 5000 km-long coastline appears in the middle of nowhere. A server might be able to use those images to create better textures.
When I look at those pictures, I imagine the least costly way would be to merge a few nice cloud meshes from a database of different models, then slightly transform those meshes inside a shader while the user passes over. If the user is still there 90 minutes later when the view comes back around, it does not matter if the models are not exactly the same. However, a hurricane cannot simply disappear.
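To illustrate the kind of in-shader transformation I have in mind, here is a minimal vertex-shader sketch that slowly nudges a pre-built cloud mesh along its normals over time; the attribute/uniform names and the displacement amplitude are just placeholders I made up:

```glsl
// Sketch only: drift a pre-built cloud mesh so it never looks exactly
// the same on the next orbit. aPosition/aNormal/uTime/uModelViewProjection
// are assumed names, not part of any existing codebase.
attribute vec3 aPosition;
attribute vec3 aNormal;
uniform mat4 uModelViewProjection;
uniform float uTime; // seconds since the cloud model was spawned

// Cheap pseudo-random hash; any 3D noise would do here.
float hash(vec3 p) {
    return fract(sin(dot(p, vec3(12.9898, 78.233, 37.719))) * 43758.5453);
}

void main() {
    // Each region of the mesh oscillates with its own phase, with a small
    // amplitude so a hurricane stays recognisable between passes.
    float phase = hash(floor(aPosition * 4.0)) * 6.2831;
    vec3 displaced = aPosition + aNormal * sin(uTime * 0.1 + phase) * 0.02;
    gl_Position = uModelViewProjection * vec4(displaced, 1.0);
}
```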
What do you think about this?
For such effects there is probably just one way to do it properly and that is:
Voxel maps + volume rendering, probably with backward ray-tracing
As your position is fixed, it should not be too hard on memory requirements. You need to implement both Mie and Rayleigh scattering. Scattering can be simplified a lot and still look good; see:
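To give a rough idea of what such a simplified pass could look like, here is an untested single-scattering ray-march sketch through a voxel map in a WebGL2 fragment shader; the uniform names, step count, constants, and the Henyey-Greenstein stand-in for the Mie phase function are all assumptions, not a tuned implementation:

```glsl
#version 300 es
// Minimal single-scattering ray march through a cloud voxel map (sketch).
precision highp float;
precision highp sampler3D;

uniform sampler3D uCloudVoxels;   // assumed: cloud density in [0,1]
uniform vec3  uSunDir;            // assumed: normalised sun direction
uniform float uDensityScale;

in  vec3 vRayOrigin;              // entry point into the volume, in [0,1]^3
in  vec3 vRayDir;                 // normalised view ray in volume space
out vec4 fragColor;

// Rayleigh phase function.
float phaseRayleigh(float cosTheta) {
    return 3.0 / (16.0 * 3.14159265) * (1.0 + cosTheta * cosTheta);
}

// Henyey-Greenstein approximation of the Mie phase function.
float phaseMie(float cosTheta, float g) {
    float g2 = g * g;
    return (1.0 - g2) /
           (4.0 * 3.14159265 * pow(1.0 + g2 - 2.0 * g * cosTheta, 1.5));
}

void main() {
    const int STEPS = 48;
    float stepLen = 1.0 / float(STEPS);
    float cosTheta = dot(normalize(vRayDir), uSunDir);
    float phase = 0.9 * phaseMie(cosTheta, 0.76) + 0.1 * phaseRayleigh(cosTheta);

    float transmittance = 1.0;
    float scattered = 0.0;
    for (int i = 0; i < STEPS; i++) {
        vec3 p = vRayOrigin + vRayDir * (float(i) + 0.5) * stepLen;
        float density = texture(uCloudVoxels, p).r * uDensityScale;
        float extinction = exp(-density * stepLen);
        // Sunlight reaching the sample is treated as constant here;
        // a second short march toward the sun would add self-shadowing.
        scattered += transmittance * (1.0 - extinction) * phase;
        transmittance *= extinction;
        if (transmittance < 0.01) break;
    }
    fragColor = vec4(vec3(scattered), 1.0 - transmittance);
}
```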
Voxel maps handle light gaps, shadows and scattering relatively easily, but they need a lot of memory and computational power. All the other 2D techniques usually just painfully work around what 3D voxel maps do natively with little effort. For example:
Procedural cloud map generators
You need one of these for each type of cloud so you have something to render. There are libs/demos/examples out there; see:
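As a rough sketch of such a generator, a simple fBm (fractional Brownian motion) fragment shader rendered to an offscreen texture can produce a coverage/density map per cloud type; the noise implementation, scale and threshold values below are placeholders to be tuned:

```glsl
// Sketch of a procedural cloud-coverage generator rendered to a texture.
// uSeed/uCoverage and all constants are assumed placeholder values.
precision highp float;

uniform float uSeed;       // change to get a different cloud field
uniform float uCoverage;   // e.g. ~0.6 to match the "60% of the Earth" figure

varying vec2 vUv;

float hash(vec2 p) {
    return fract(sin(dot(p, vec2(127.1, 311.7)) + uSeed) * 43758.5453);
}

// Smoothly interpolated value noise.
float valueNoise(vec2 p) {
    vec2 i = floor(p), f = fract(p);
    vec2 u = f * f * (3.0 - 2.0 * f);
    return mix(mix(hash(i),                   hash(i + vec2(1.0, 0.0)), u.x),
               mix(hash(i + vec2(0.0, 1.0)),  hash(i + vec2(1.0, 1.0)), u.x),
               u.y);
}

// fBm: sum several octaves of noise at increasing frequency.
float fbm(vec2 p) {
    float sum = 0.0, amp = 0.5;
    for (int i = 0; i < 5; i++) {
        sum += amp * valueNoise(p);
        p *= 2.02;
        amp *= 0.5;
    }
    return sum;
}

void main() {
    float density = fbm(vUv * 8.0);
    // Rough remap so that approximately uCoverage of the map ends up cloudy.
    float cloud = smoothstep(1.0 - uCoverage - 0.1, 1.0 - uCoverage + 0.2, density);
    gl_FragColor = vec4(vec3(cloud), 1.0);
}
```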