Understand coordinate spaces in ARKit

I've read all of Apple's guides about ARKit and watched a WWDC video, but I can't understand how the coordinate systems that are bound to:

  1. A real world
  2. A device
  3. A 3D scene

connect to each other.

I can add an object, for example an SCNPlane:

// `scene` is the SCNScene displayed by an ARSCNView
let stripe = SCNPlane(width: 0.005, height: 0.1)   // 5 mm × 10 cm plane
let stripeNode = SCNNode(geometry: stripe)
scene.rootNode.addChildNode(stripeNode)

This produces a white stripe that is oriented vertically, no matter how the device is oriented at that moment. That means the coordinate system is somehow bound to gravity! But if I print the upAxis attribute of the SCNScene every frame, it stays the same no matter how I rotate the iPhone. I also tried printing stripeNode.worldTransform, and it doesn't change either.

Any help in understanding ARKit coordinates is welcome.

asked Sep 27 '17 by kelin

1 Answer

By default, the world coordinate system in ARKit is based on ARKit's understanding of the real world around your device. (And yes, it's oriented to gravity, thanks to the device's motion sensing hardware.)
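For instance, gravity alignment is the default, but you can set it explicitly when configuring the session. A minimal sketch, assuming sceneView is your ARSCNView:

// Gravity alignment is the default for world tracking
let configuration = ARWorldTrackingConfiguration()
configuration.worldAlignment = .gravity              // Y axis points up, opposite gravity
// .gravityAndHeading would additionally align -Z with compass north
sceneView.session.run(configuration)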

Also by default, when you use ARSCNView to display SceneKit content in an ARKit session, the coordinate system of the scene's rootNode is matched to the ARKit world coordinate system. That means that "up" in scene space is always the same as "up" in real-world space, no matter how you move your device.
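An easy way to see this shared coordinate system is to turn on the debug visualization, which draws the world origin and axes in the camera feed. A minimal sketch, assuming sceneView is your ARSCNView:

// Render the world origin/axes and detected feature points in the view
sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin,
                          ARSCNDebugOptions.showFeaturePoints]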

Aside: The scene attributes API where you found an upAxis isn't what you think it is. Those attributes describe the inherent settings of scenes loaded from files — e.g. someone sends you a DAE file designed in a 3D authoring tool where the Z axis means "up", so that attribute tells SceneKit to rotate the data in that file to fit SceneKit's Y-up coordinate system convention.
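If you're curious, here's what that attribute is actually for — reading the authoring-tool up axis of a scene loaded from a file. A hedged sketch, with a hypothetical file name:

// Load a scene authored in an external 3D tool (hypothetical file name)
let importedScene = SCNScene(named: "model.dae")!

// upAxis describes the source file's convention, not the live world space
if let value = importedScene.attribute(forKey: SCNScene.Attribute.upAxis.rawValue) as? NSValue {
    print("Authoring-tool up axis:", value.scnVector3Value)
}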

If you want to find the up axis of some SceneKit node relative to world space, the worldUp or simdWorldUp property is what you want. Note, however, that those are relative to world space. There's no API for asking which direction world space itself points in, because that would mean a direction relative to something else... and world space is the "absolute" coordinate system that everything else is relative to. So you have to rely on how world space is defined.
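So, for the stripe node from the question, you could check this every frame. A minimal sketch:

import simd

// World-space up direction of the node; with ARSCNView's default setup this stays
// near (0, 1, 0) no matter how the phone is held (unless you rotate the node itself)
print(stripeNode.simdWorldUp)
print(simd_distance(stripeNode.simdWorldUp, simd_float3(0, 1, 0)) < 0.001)   // true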

The great thing about this matching of coordinate systems is that you can put SceneKit things in real-world space very easily. Your code places a white stripe at 0, 0, 0 because you didn't explicitly give it a position. In an ARKit session, 0, 0, 0 is the position of your device when you started the session... so you should be able to run that code and then take a step backwards to see your white stripe.
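Or, instead of relying on the implicit origin, you can give the stripe an explicit position half a meter in front of where the session started. A minimal sketch, reusing the stripeNode from the question:

// (0, 0, 0) is where the session started; with the default .gravity alignment,
// -Z points roughly in the direction the device was initially facing.
stripeNode.position = SCNVector3(0, 0, -0.5)   // 50 cm ahead of the world origin
scene.rootNode.addChildNode(stripeNode)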


In short, the coordinate system model for ARKit is this: The world is fixed, and your device/camera moves within it.

This means that if you want to do anything relative to the current position/orientation of the device, you need a conversion to camera space.
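For example, you can convert a world-space point into the camera's local coordinate space. A minimal sketch, assuming sceneView is your ARSCNView:

// Convert a world-space point into camera space (passing nil means "from world space")
if let cameraNode = sceneView.pointOfView {
    let worldPoint = SCNVector3(0, 0, -0.5)
    let pointInCameraSpace = cameraNode.convertPosition(worldPoint, from: nil)
    // A negative z component means the point is in front of the camera right now
    print(pointInCameraSpace)
}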

If you're working with SceneKit, that's easy: view.pointOfView gives you the SceneKit camera node, so you can...

  • add child nodes to that node, and they'll stay "attached" to the camera as you move it (e.g. HUD elements, or maybe a pickaxe if you're making a Minecraft-alike)
  • use the camera node as the target of a constraint to make other content in the scene interact with the camera as you move it (e.g. make a game character look at the camera)
  • use the camera node's transform (or the various convenience properties for accessing parts of the transform) to position other content in your scene (e.g. node.simdPosition = cameraNode.simdWorldPosition + cameraNode.simdWorldFront * 0.5 to drop a node 50 cm in front of where the camera is right now; see the sketch below)
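Putting those three together — a hedged sketch where hudNode, characterNode, and droppedNode are just placeholder names, run from inside some update or action method:

guard let cameraNode = sceneView.pointOfView else { return }

// 1. Attach content to the camera so it follows the device (HUD element, held item, ...)
let hudNode = SCNNode(geometry: SCNPlane(width: 0.05, height: 0.02))
hudNode.position = SCNVector3(0, -0.05, -0.2)   // slightly below and in front of the lens
cameraNode.addChildNode(hudNode)

// 2. Make other content track the camera with a look-at constraint
let characterNode = SCNNode(geometry: SCNSphere(radius: 0.05))
characterNode.constraints = [SCNLookAtConstraint(target: cameraNode)]
scene.rootNode.addChildNode(characterNode)

// 3. Drop a node 50 cm in front of wherever the camera is right now
let droppedNode = SCNNode(geometry: SCNBox(width: 0.05, height: 0.05, length: 0.05, chamferRadius: 0))
droppedNode.simdPosition = cameraNode.simdWorldPosition + cameraNode.simdWorldFront * 0.5
scene.rootNode.addChildNode(droppedNode)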
answered Nov 17 '22 by rickster