 

SceneKit and CoreMotion "lagging" on iPhone 5 but normal on iPhone 6

I was experimenting with SceneKit, so I decided to make a virtual reality test app. I get the data from device motion and then change the camera angles based on it. I was testing on an iPhone 6 with no problems. Things get weird when I run it on an iPhone 5 and a friend's iPhone 5S: the camera takes some delay to rotate, very different from the iPhone 6, where it's instant. The FPS counter in Xcode stays around 60 FPS on both devices, and I added some timing tests showing the motion data also arrives at around 60 Hz on both devices. I'm lost.

The base for the code is this project: https://github.com/stevenjs/VR-iOS-Experiment

If possible, I would like some tips on how to fix it, not just code ready to be copied and pasted. I want to learn.

I can't explain it well in words, so I've recorded some videos to show what's going on:

iPhone 6: https://www.youtube.com/watch?v=pdyTEwvsR1I

iPhone 5: https://www.youtube.com/watch?v=MlDdKtHkrYo

Here's the code:

    // Create Scene
    var scene = SCNScene()
    scnView.scene = scene
    //scnView.autoenablesDefaultLighting = true

    //Add camera to scene.
    let camera = SCNCamera()
    camera.xFov = 45
    camera.yFov = 45

    let camerasNode = SCNNode()
    camerasNode.camera = camera
    camerasNode.position = SCNVector3(x: 0, y: 0, z: 0)
    // The user will be holding their device up (i.e. 90 degrees roll from a flat orientation)
    // so roll the camera by -90 degrees to orient the view correctly
    // otherwise the object will be created "below" the user
    camerasNode.eulerAngles = SCNVector3Make(degreesToRadians(-90.0), 0, 0)

    let cameraRollNode = SCNNode()
    cameraRollNode.addChildNode(camerasNode)

    let cameraPitchNode = SCNNode()
    cameraPitchNode.addChildNode(cameraRollNode)

    let cameraYawNode = SCNNode()
    cameraYawNode.addChildNode(cameraPitchNode)

    scene.rootNode.addChildNode(cameraYawNode)

    scnView.pointOfView = camerasNode

    // Ambient Light
    let ambientLight = SCNLight()
    ambientLight.type = SCNLightTypeAmbient
    ambientLight.color = UIColor(white: 0.1, alpha: 1.0)
    let ambientLightNode = SCNNode()
    ambientLightNode.light = ambientLight
    scene.rootNode.addChildNode(ambientLightNode)

    // Omni Light
    let diffuseLight = SCNLight()
    diffuseLight.type = SCNLightTypeOmni
    diffuseLight.color = UIColor(white: 1.0, alpha: 1.0)
    let diffuseLightNode = SCNNode()
    diffuseLightNode.light = diffuseLight
    diffuseLightNode.position = SCNVector3(x: -30, y: 30, z: 50)
    scene.rootNode.addChildNode(diffuseLightNode)

    let material = SCNMaterial()
    material.diffuse.contents = UIColor.redColor()
    material.locksAmbientWithDiffuse = true
    let material2 = SCNMaterial()
    material2.diffuse.contents = UIColor.whiteColor()
    material2.locksAmbientWithDiffuse = true
    let material3 = SCNMaterial()
    material3.diffuse.contents = UIColor.blueColor()
    material3.locksAmbientWithDiffuse = true
    let material4 = SCNMaterial()
    material4.diffuse.contents = UIColor.purpleColor()
    material4.locksAmbientWithDiffuse = true
    let material5 = SCNMaterial()
    material5.diffuse.contents = UIColor.yellowColor()
    material5.locksAmbientWithDiffuse = true
    let material6 = SCNMaterial()
    material6.diffuse.contents = UIColor.orangeColor()
    material6.locksAmbientWithDiffuse = true


    //Create the box
    let baseBox = SCNBox(width: 5, height: 5, length: 5, chamferRadius: 0)

    baseBox.materials = [material, material2, material3, material4, material5, material6]
    let boxNode = SCNNode(geometry: baseBox)
    boxNode.position = SCNVector3(x: 0, y: 3, z: -10)
    scene.rootNode.addChildNode(boxNode)


    // Respond to user head movement
    motionManager = CMMotionManager()
    motionManager?.deviceMotionUpdateInterval = 1.0 / 60.0

    motionManager?.startDeviceMotionUpdatesUsingReferenceFrame(CMAttitudeReferenceFrameXArbitraryZVertical,
        toQueue: NSOperationQueue.mainQueue(),
        withHandler: { (motion: CMDeviceMotion!, error: NSError!) -> Void in
            self.counter++
            let currentAttitude = motion.attitude
            let roll = Float(currentAttitude.roll)
            let pitch = Float(currentAttitude.pitch)
            let yaw = Float(currentAttitude.yaw)

            //only working at 60FPS on iPhone 6... WHY

            //according to the documentation, a node's eulerAngles vector is (pitch, yaw, roll)
            cameraRollNode.eulerAngles = SCNVector3Make(0.0, 0.0, -roll)
            cameraPitchNode.eulerAngles = SCNVector3Make(pitch, 0.0, 0.0)
            cameraYawNode.eulerAngles = SCNVector3Make(0.0, yaw, 0.0)
    })
asked Mar 26 '15 by GBF_Gabriel


1 Answer

Well, this was just a hunch, but I guess it turned out correct, so let's make it formal!

Your problem seems to be that there are two 60Hz timers/loops contending for time on the main thread — the CoreMotion update queue, and the SceneKit rendering loop. You're effectively communicating from one to the other, in that you're setting SceneKit state in the CoreMotion update handler, and hoping that the SceneKit render loop picks it up on the same frame.

(I suspect the hardware difference is that the iPhone 6 has enough performance overhead that the timing happens to work out okay, and the older hardware doesn't — so SceneKit is picking up model positions that are a frame or two behind your motion data.)

Instead, do everything in the same 60Hz loop. (Really, this is good general advice, whether your other 60Hz event source is CoreMotion or something else — in a game engine or similar real-time renderer, you should let the target framerate for rendering dictate everything else you do.)

That means you should be polling for motion data, not getting it pushed to you — you only care about the motion state as of the timestamp of the frame you're rendering, not the sample from 1/60 sec before, or the last several samples if you dropped a frame. Hook into the SceneKit rendering loop with a scene renderer delegate, read CoreMotion's deviceMotion in your renderer:updateAtTime: method and set your model rotations there. Note that you still need to call startDeviceMotionUpdatesUsingReferenceFrame: before you can start polling for motion data.
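
Here's a rough sketch of that structure, matching the Swift/SDK vintage of the question's code (exact delegate signatures can differ slightly between versions, and the method name setupMotion and the stored node properties are just assumptions to keep the example self-contained):

    import UIKit
    import SceneKit
    import CoreMotion

    class ViewController: UIViewController, SCNSceneRendererDelegate {

        // Stored as properties so the render delegate can reach them
        // (in the question they're locals in the setup code).
        var motionManager: CMMotionManager?
        let cameraRollNode = SCNNode()
        let cameraPitchNode = SCNNode()
        let cameraYawNode = SCNNode()

        func setupMotion(scnView: SCNView) {
            motionManager = CMMotionManager()
            motionManager?.deviceMotionUpdateInterval = 1.0 / 60.0
            // Start updates with no handler or queue; we'll poll deviceMotion once per frame instead.
            motionManager?.startDeviceMotionUpdatesUsingReferenceFrame(CMAttitudeReferenceFrameXArbitraryZVertical)
            scnView.delegate = self
        }

        // Called by SceneKit once per frame, before the frame is rendered
        // (on SceneKit's rendering thread, not the main thread).
        func renderer(aRenderer: SCNSceneRenderer, updateAtTime time: NSTimeInterval) {
            // Poll the most recent motion sample; it's nil until the first update arrives.
            if let attitude = motionManager?.deviceMotion?.attitude {
                let roll = Float(attitude.roll)
                let pitch = Float(attitude.pitch)
                let yaw = Float(attitude.yaw)

                cameraRollNode.eulerAngles = SCNVector3Make(0.0, 0.0, -roll)
                cameraPitchNode.eulerAngles = SCNVector3Make(pitch, 0.0, 0.0)
                cameraYawNode.eulerAngles = SCNVector3Make(0.0, yaw, 0.0)
            }
        }
    }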

Also, SceneKit normally runs its render loop only when it knows the scene content needs to be animating: for example, if you've attached an animation to something in the scene, set an animatable property in a transaction, or are using actions or physics. (That way, if the next frame rendered would be the same as the last one, it won't run down your device's battery drawing the same thing again and again.) But when you're expecting to make changes during the animation loop, you need to make sure it's running all the time.

Setting the playing property on your view to true is actually the right way to do that: it tells SceneKit you want it to keep rendering rather than stopping when animations stop. (Yeah, the documentation could probably be clearer on that point... if you let Apple know, they'll probably fix it.)
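
With the variable name from the question's code, that's a single extra line in setup:

    // Keep the render loop running even when nothing is explicitly animating,
    // so the renderer delegate gets called every frame.
    scnView.playing = true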


Finally, a Swift tip: you can assign directly to the members of a struct-typed property without constructing a whole new value. So instead of doing this:

cameraRollNode.eulerAngles = SCNVector3Make(0.0, 0.0, -roll)
cameraPitchNode.eulerAngles = SCNVector3Make(pitch, 0.0, 0.0)
cameraYawNode.eulerAngles = SCNVector3Make(0.0, yaw, 0.0)

You can do this:

cameraRollNode.eulerAngles.z = -roll
cameraPitchNode.eulerAngles.x = pitch
cameraYawNode.eulerAngles.y = yaw
answered by rickster