I've read all of Apple's guides about ARKit and watched a WWDC video, but I can't understand how its coordinate systems connect to each other.
I can add an object, for example an SCNPlane:
let stripe = SCNPlane(width: 0.005, height: 0.1)
let stripeNode = SCNNode(geometry: stripe)
scene.rootNode.addChildNode(stripeNode)
This produces a white stripe that is oriented vertically, no matter how the device is oriented at that moment. That means the coordinate system is somehow bound to gravity! But if I print the scene's upAxis attribute every frame, it stays the same no matter how I rotate the iPhone. I also tried printing stripeNode.worldTransform, and it doesn't change either.
Any help in understanding ARKit coordinates is welcome.
By default, the world coordinate system in ARKit is based on ARKit's understanding of the real world around your device. (And yes, it's oriented to gravity, thanks to the device's motion sensing hardware.)
Also by default, when you use ARSCNView to display SceneKit content in an ARKit session, the coordinate system of the scene's rootNode is matched to the ARKit world coordinate system. That means that "up" in scene space is always the same as "up" in real-world space, no matter how you move your device.
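If you want to see where that default comes from, it's the session configuration's worldAlignment setting. A minimal sketch of the setup (illustrative only, since it needs an iOS app to run; sceneView is an assumed ARSCNView):

```swift
import ARKit

// Sketch: how ARKit's world coordinate system gets its orientation.
let configuration = ARWorldTrackingConfiguration()

// .gravity is the default: the Y axis opposes gravity, and the origin
// (plus the X/Z directions) come from the device's initial pose.
configuration.worldAlignment = .gravity

// Alternatives: .gravityAndHeading (Y up, -Z toward true north, X east)
// or .camera (coordinate system locked to the device's own orientation).

// sceneView is an assumed ARSCNView already on screen:
// sceneView.session.run(configuration)
```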
Aside: The scene attributes API where you found an upAxis isn't what you think it is. Those attributes describe the inherent settings of scenes loaded from files: e.g. someone sends you a DAE file designed in a 3D authoring tool where the Z axis means "up", and that attribute tells SceneKit to rotate the data in that file to fit SceneKit's Y-up coordinate system convention.

If you want to find the up axis of some SceneKit node relative to world space, the worldUp or simdWorldUp property is what you want. Note, though, that those are relative to world space. There's no API for asking what direction world space itself points in, because that would mean a direction relative to something else... and world space is the "absolute" coordinate system that everything else is relative to. So you have to rely on its definition.
The great thing about this matching of coordinate systems is that you can put SceneKit things in real-world space very easily. Your code places a white stripe at 0, 0, 0
because you didn't explicitly give it a position. In an ARKit session, 0, 0, 0
is the position of your device when you started the session... so you should be able to run that code and then take a step backwards to see your white stripe.
In short, the coordinate system model for ARKit is this: The world is fixed, and your device/camera moves within it.
This means that if you want to do anything relative to the current position/orientation of the device, you need a conversion to camera space.
If you're working with SceneKit, that's easy: view.pointOfView gives you the SceneKit camera node, so you can do something like:

node.simdPosition = view.pointOfView!.simdWorldPosition + view.pointOfView!.simdWorldFront * 0.5
to drop a node 50 cm in front of wherever the camera is right now.
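The arithmetic behind that one-liner can be sketched without ARKit at all. This is plain Swift with made-up example values (not anything ARKit actually returns): the camera's world transform gives you its position and its forward direction (its -Z axis), and the target point is position + forward × distance.

```swift
import Foundation

struct Vec3 { var x, y, z: Double }

// Hypothetical example values: camera at the session origin, yawed 90°
// to the left, so its forward (-Z) axis points along world -X.
let cameraPosition = Vec3(x: 0, y: 0, z: 0)
let cameraForward  = Vec3(x: -1, y: 0, z: 0)  // unit vector, like simdWorldFront

// Place a point 50 cm along the camera's current view direction.
let distance = 0.5
let target = Vec3(x: cameraPosition.x + cameraForward.x * distance,
                  y: cameraPosition.y + cameraForward.y * distance,
                  z: cameraPosition.z + cameraForward.z * distance)
// target = (-0.5, 0, 0): half a meter in front of the camera, in world space.
```

Because the result is a world-space position, the node stays put in the world even as the camera keeps moving; nothing re-parents it to the camera.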