We have an app that uses Metal to render. This app works correctly on devices running iOS 11. When using the same app on devices running iOS 12, we started getting glitches and sometimes hangs in the rendering. We also tried recompiling for iOS 12 and are getting the same bad behavior. On the console we are getting the following messages:
2018-09-22 09:22:29.508576-0500 OurApp [1286:84481] Execution of the command buffer was aborted due to an error during execution. Discarded (victim of GPU error/recovery) (IOAF code 5)
2018-09-22 09:29:55.654426-0500 OurApp [1286:84625] Execution of the command buffer was aborted due to an error during execution. Caused GPU Hang Error (IOAF code 3)
2018-09-22 09:34:37.718054-0500 OurApp [1286:87354] Execution of the command buffer was aborted due to an error during execution. Ignored (for causing prior/excessive GPU errors) (IOAF code 4)
With the first two messages the rendering seems glitchy: a blank screen is presented and then, eventually, the rendering appears on screen. With the last message the rendering doesn't occur at all, and the message keeps being logged until we move to a different view.
This app uses SceneKit, instantiates an SCNView and uses a default CIContext. It also uses the Physically Based Lighting model, which forces the Metal renderer to be used. The app has a simple SCNNode geometry, a cylinder. Each geometry object of the cylinder gets a normal texture (3 in total). The same diffuse, metalness and roughness values are applied to all the geometry objects of the cylinder.
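For reference, a minimal sketch of that setup, assuming placeholder asset names and material values (the real ones in our app differ):

import SceneKit
import UIKit

// Minimal sketch of the setup described above; asset names and material
// values are placeholders, not the ones from the actual app.
func makeCylinderScene() -> SCNScene {
    let scene = SCNScene()

    let cylinder = SCNCylinder(radius: 0.5, height: 2.0)
    // A cylinder has three geometry elements (side, top, bottom),
    // so it gets three materials, one per element.
    cylinder.materials = (0..<3).map { index in
        let material = SCNMaterial()
        material.lightingModel = .physicallyBased   // forces the Metal renderer
        material.normal.contents = UIImage(named: "normalTexture\(index)")
        material.diffuse.contents = UIColor.gray
        material.metalness.contents = 0.8
        material.roughness.contents = 0.3
        return material
    }

    scene.rootNode.addChildNode(SCNNode(geometry: cylinder))
    return scene
}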
Has anybody run into this problem? If so, how did you solve it?
Thanks
UPDATE: The problem seems to be caused when an image is used as the scene's lighting environment:
let scene = SCNScene()
scene.lightingEnvironment.contents = UIImage(named: "ourLightingEnvironmentImage")
When a lighting environment isn't used, the problem goes away. This is starting to look like an Apple bug, so we will file one. We are stuck because we need the lighting environment to produce realistic reflections for the models in our app.
Another problem is that there is a case where the semaphore's MTLEvent won't get signaled at all: when the image is presented without using a command buffer. When I made it use a command buffer if we had device-side MTLEvents, I only made it check whether there were semaphores to wait on, not to signal.
Running with MTLEvents=on, synchronousQueueSubmits=true, prefillMetalCommandBuffers=true, maxActiveMetalCommandBuffersPerQueue=2 outside of Xcode results in a blank app window, i.e. nothing happens at all.
Execution of the command buffer was aborted due to an error during execution. Caused GPU Timeout Error (IOAF code 2)
This is what I was looking for. What I suspect is happening is that the semaphore wait value is being incremented before the semaphore signal is scheduled.
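For context, a simplified sketch (assumed, not the actual implementation) of the device-side MTLEvent signal/wait pattern being discussed; if a command buffer waits on a value that is never scheduled to be signaled, its work never runs and the GPU eventually reports a timeout:

import Metal

// Simplified sketch: a producer command buffer signals an MTLEvent and a
// consumer command buffer waits on the same value. If the wait value gets
// ahead of any scheduled signal, the consumer stalls until a GPU timeout.
func submitProducerAndConsumer(device: MTLDevice, queue: MTLCommandQueue) {
    guard let event = device.makeEvent() else { return }
    let value: UInt64 = 1

    if let producer = queue.makeCommandBuffer() {
        // ... encode the producer's work here ...
        producer.encodeSignalEvent(event, value: value)   // signal when done
        producer.commit()
    }

    if let consumer = queue.makeCommandBuffer() {
        consumer.encodeWaitForEvent(event, value: value)  // wait before its work
        // ... encode the consumer's work here ...
        consumer.commit()
    }
}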
It's a bug, caused by high-resolution environment lighting images.
The solution is to resize these images; 1000x500 is what we are using now, and it works perfectly.
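For example, a sketch of the workaround, assuming the image name from the question: downscale the image to 1000x500 before assigning it as the lighting environment.

import UIKit
import SceneKit

// Sketch of the workaround: downscale the environment image before
// assigning it to scene.lightingEnvironment.contents.
func applyLightingEnvironment(to scene: SCNScene) {
    guard let original = UIImage(named: "ourLightingEnvironmentImage") else { return }
    let targetSize = CGSize(width: 1000, height: 500)
    let renderer = UIGraphicsImageRenderer(size: targetSize)
    let resized = renderer.image { _ in
        original.draw(in: CGRect(origin: .zero, size: targetSize))
    }
    scene.lightingEnvironment.contents = resized
}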