http://bodybrowser.googlelabs.com/
I know the source of their data, and even for real-time use those meshes and textures are often hundreds of KB or even several MB each. Even at low resolution, it's hard to see how they get it working within a second of loading. I see some evidence of texture streaming if I immediately switch to a different layer, but getting even the polygons downloaded in time is impressive.
How have they achieved this - super-low poly counts or just very clever progressive loading?
Some of the tricks Google uses for fast initial Body load and general performance are delta compression of the vertex data, limiting numeric precision, and texture atlases.
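As a rough sketch of what delta compression plus limited precision looks like (function names and the scale factor are illustrative, not Google's actual code): quantize each coordinate to an integer, then store only the difference from the previous vertex. Deltas between nearby vertices are small and repetitive, so the stream compresses far better under gzip than raw floats.

```javascript
// Encoder (server side, illustrative): quantize to integers, then
// store deltas between consecutive values instead of absolute values.
function deltaEncode(positions, scale) {
  const out = new Int16Array(positions.length);
  let prev = 0;
  for (let i = 0; i < positions.length; i++) {
    const q = Math.round(positions[i] * scale); // limit numeric precision
    out[i] = q - prev;
    prev = q;
  }
  return out;
}

// Decoder (runs in the browser): accumulate the deltas and rescale
// back to floats before uploading to a WebGL buffer.
function deltaDecode(deltas, scale) {
  const out = new Float32Array(deltas.length);
  let prev = 0;
  for (let i = 0; i < deltas.length; i++) {
    prev += deltas[i];
    out[i] = prev / scale;
  }
  return out;
}
```

The round trip is lossy by design: with `scale = 1000` you keep about three decimal places, which is plenty for on-screen anatomy meshes and shrinks the payload before gzip even runs.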
Cache headers help reduce load time further, but only on subsequent visits.
For rendering speed (which affects loading time, since rendering runs concurrently with loading), they also use staple tricks such as vertex array aggregation, vertex array interleaving, and GL state sorting.
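To sketch vertex array interleaving (attribute layout is illustrative): instead of one buffer per attribute, pack all per-vertex attributes contiguously so the GPU reads one stride per vertex, then describe the layout with byte offsets.

```javascript
// Pack separate position (xyz) and uv streams into one interleaved
// Float32Array: [x, y, z, u, v, x, y, z, u, v, ...].
// Two attributes are shown for brevity; real meshes add normals, etc.
function interleave(positions, uvs) {
  const n = positions.length / 3;
  const data = new Float32Array(n * 5);
  for (let i = 0; i < n; i++) {
    data[i * 5 + 0] = positions[i * 3 + 0];
    data[i * 5 + 1] = positions[i * 3 + 1];
    data[i * 5 + 2] = positions[i * 3 + 2];
    data[i * 5 + 3] = uvs[i * 2 + 0];
    data[i * 5 + 4] = uvs[i * 2 + 1];
  }
  return data;
}

// In WebGL the interleaved buffer is bound once and described with
// strides/offsets (posLoc/uvLoc are shader attribute locations):
//
//   gl.bindBuffer(gl.ARRAY_BUFFER, buf);
//   gl.bufferData(gl.ARRAY_BUFFER, interleave(positions, uvs), gl.STATIC_DRAW);
//   gl.vertexAttribPointer(posLoc, 3, gl.FLOAT, false, 20, 0);  // stride = 20 bytes
//   gl.vertexAttribPointer(uvLoc,  2, gl.FLOAT, false, 20, 12); // offset = 12 bytes
```

One buffer bind per mesh instead of one per attribute also means fewer GL state changes per draw call, which is where the state-sorting trick kicks in.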
Google I/O 2011: WebGL Techniques and Performance: http://www.youtube.com/watch?v=rfQ8rKGTVlg
Google Body presentation at WebGL Camp #3: http://www.youtube.com/watch?v=vsHHNClzJPg