
What causes parts of SceneKit geometry to "pop" in and out when viewed from different angles?

I am fairly novice when working with 3D, but I will do my best to explain my situation.

I am importing some models (DAE files from Maya) into SceneKit (Xcode 9) that use semi-transparent materials. In this specific case I will refer to a light bulb model. When viewing the model straight on, the filament inside is visible through the outer "glass" layer. [screenshot: bulb viewed head-on, filament visible] When I pivot the model along the x axis even just slightly, the filament disappears. [screenshot: bulb pivoted slightly, filament gone]

I am noticing this behavior in other situations with geometry contained inside other geometry with semi-transparent materials. I am also using Physically Based Lighting. Has anyone else encountered this? Thank you in advance.

asked Oct 05 '17 by Michael T

1 Answer

In the second image, your light bulb has burned out. You should go to the hardware store and buy a new light bulb. 😉

Okay, seriously...

Transparency in 3D rendering requires special care. Normally, when SceneKit (or any graphics technology, really) renders an opaque object, it writes to two outputs in the render target: one for color, and one for depth. Having depth information during the rendering process means that opaque objects correctly appear in front of or behind each other, regardless of what order they render in, and enables some performance optimization.

For example, if I draw a teapot, and then draw a set of teacups around the teapot, the GPU can quickly decide whether to keep color pixels from the teapot or replace them with pixels from the teacups depending on their relative depth. (And if the depth test says it doesn't need some of those pixels, it doesn't have to waste time rendering them.)
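Conceptually, the per-fragment depth test boils down to compare-and-keep. Here is an illustrative sketch in Swift; the real thing happens per pixel in GPU hardware, not in app code:

```swift
// Illustrative only: a software model of the GPU's depth test.
struct Fragment {
    var color: SIMD4<Float>
    var depth: Float  // distance from the camera; smaller = closer
}

func depthTest(incoming: Fragment,
               storedColor: inout SIMD4<Float>,
               storedDepth: inout Float) {
    // Keep the incoming fragment only if it is closer than what's
    // already stored for this pixel.
    if incoming.depth < storedDepth {
        storedColor = incoming.color
        storedDepth = incoming.depth
    }
    // Otherwise it's discarded: no color write, and with early depth
    // testing the GPU can skip shading it entirely.
}
```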

Depth tests work only for opaque objects, though. For semitransparent objects, you have to manage render order yourself: render all the opaque things first, then the transparent things ordered from farthest to nearest the camera, so that each transparent surface blends on top of whatever has already been drawn behind it.
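In SceneKit, the coarsest lever for this is the renderingOrder property on SCNNode. A minimal sketch, with hypothetical node names standing in for a filament-inside-glass setup like your bulb:

```swift
import SceneKit

// Hypothetical nodes: an opaque filament inside a semi-transparent shell.
let filamentNode = SCNNode(geometry: SCNSphere(radius: 0.1))
let glassNode = SCNNode(geometry: SCNSphere(radius: 1.0))

// Mark the shell's material as semi-transparent.
glassNode.geometry?.firstMaterial?.transparency = 0.5

// Nodes with lower renderingOrder values draw first, so the opaque
// interior is already in the framebuffer when the shell blends over it.
filamentNode.renderingOrder = 0
glassNode.renderingOrder = 1
```

Note that this only orders whole nodes relative to each other; it can't reorder the triangles within a single mesh, which is exactly the wrinkle below.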

SceneKit has a few tricks for managing render order (see the sketch above), but in your case there's an extra wrinkle: you have a single semitransparent surface, so parts of the surface need to render in front of other parts, and which parts are which depends on the surface's orientation relative to the camera. High-level scene-graph frameworks like SceneKit usually aren't much help there, because they render all the triangles in a mesh at once, so you're stuck with whatever order they were defined in, leading to situations like you see on the left in the image below.

[image: transparency modes compared, from the SceneKit WWDC17 talk]

However, SceneKit in iOS 11 (and tvOS 11 and macOS 10.13 High Sierra, probably watchOS 4 too) has a solution. Take a look at the new transparency modes: if you use the dualLayer setting, SceneKit automatically splits a transparent surface into two render passes, one for the back faces and another for the front faces. Not only does this make sure that the triangles in the mesh blend correctly with respect to each other, but anything you put in the interior of the object will blend correctly, too.
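In code that's a one-line material setting. A sketch, assuming glassMaterial stands in for the semi-transparent material on your bulb's outer shell:

```swift
import SceneKit

// Hypothetical stand-in for the bulb's outer "glass" material.
let glassMaterial = SCNMaterial()
glassMaterial.lightingModel = .physicallyBased
glassMaterial.transparency = 0.4

// Render back-facing triangles in one pass and front-facing triangles in
// another, so the surface blends correctly with itself and with any
// geometry inside it. Requires iOS 11 / tvOS 11 / macOS 10.13 or later.
glassMaterial.transparencyMode = .dualLayer
```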

This is nicely explained in the WWDC17 talk on SceneKit (where the image above comes from).

In the case of your light bulb, you might also need to check the writesToDepthBuffer property on some of the materials.
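A sketch of that check, again with a hypothetical glassMaterial standing in for the bulb's glass:

```swift
import SceneKit

// Hypothetical stand-in for the bulb's outer "glass" material.
let glassMaterial = SCNMaterial()

// If the semi-transparent glass writes to the depth buffer, the filament
// behind it can fail the depth test and vanish at some angles. Disabling
// depth writes for the glass avoids occluding its own interior.
glassMaterial.writesToDepthBuffer = false
```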

answered Nov 01 '22 by rickster