
Displaying a DAE file with ARKit and tracking an Anchor in the scene

I'm trying out ARKit, and I set up an ARSCNView using this tutorial.

Then set up tracking horizontal 3D planes with the second part of this tutorial.

I created a single view application then constrained an ARSCNView flush to the root view with an outlet to my ViewController.

Here is the code in the ViewController:

import UIKit
import ARKit

class ViewController: UIViewController {

    //MARK: Properties
    @IBOutlet weak var arScene: ARSCNView!

    //MARK: ARKit variables
    var realityConfiguration: ARWorldTrackingSessionConfiguration?

    //MARK: Lifecycle
    override func viewDidLoad() {
        super.viewDidLoad()         
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        self.prepare()
    }

    //MARK: Actions      

    //MARK: Overrides
}

extension ViewController {
    func prepare() {
        //Check to see if active reality is supported
        guard ARSessionConfiguration.isSupported else {
            //Custom alert function that just quickly displays a UIAlertController
            AppDelegate.alert(title: "Not Supported", message: "Active Reality is not supported on this device")
            return
        }
        //Set up the ARSessionConfiguration
        self.realityConfiguration = ARWorldTrackingSessionConfiguration()
        //Set up the ARSCNView
        guard let config = self.realityConfiguration else {
            return
        }
        //Set the ARSCNView's delegate before running, so no anchor callbacks are missed
        self.arScene.delegate = self
        self.arScene.session.run(config)
    }
}

extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        return nil
    }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else {
            return
        }

        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x), height: CGFloat(planeAnchor.extent.z))
        let planeNode = SCNNode(geometry: plane)

        //SCNPlane geometry is vertical by default, so rotate it to lie flat...
        planeNode.transform = SCNMatrix4MakeRotation(-Float.pi / 2, 1, 0, 0)
        //...then position it (after setting transform, so the translation isn't overwritten)
        planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z)

        node.addChildNode(planeNode)
    }

    func renderer(_ renderer: SCNSceneRenderer, willUpdate node: SCNNode, for anchor: ARAnchor) {
        print("Will updated Node on Anchor: \(anchor.identifier)")
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        print("Did updated Node on Anchor: \(anchor.identifier)")
    }

    func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor) {
        print("Removed Node on Anchor: \(anchor.identifier)")
    }
}

I downloaded Xcode 9 beta, followed Apple's tutorials and realized my iPhone 6 does not have the A9 chip required for the ARWorldTrackingSessionConfiguration object.

About halfway down the first tutorial I linked, Apple says you can still create AR experiences without the A9 chip, but they don't go into further detail. Has anyone found a starting point, and would you be willing to provide a code example that uses a .dae file and covers:

  • Choosing an anchor point to display it
  • Tracking that anchor point
  • Actually displaying the .dae file
asked Jun 07 '17 by Jon Vogel

1 Answer

There's not really anything to see using your code — just a live camera view.

The main reason you're not seeing any augmentations in your reality is that your code adds SceneKit content to the scene only when anchors are added to the ARSession... but you're not manually adding any anchors, and you haven't enabled plane detection so ARKit isn't automatically adding anchors. If you enable plane detection, you'll start getting somewhere...

self.realityConfiguration = ARWorldTrackingSessionConfiguration()
//Ask ARKit to detect horizontal planes and add an ARPlaneAnchor for each one it finds
realityConfiguration?.planeDetection = .horizontal

But you still won't see anything. That's because your ARSCNViewDelegate implementation has conflicting instructions. This part:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    return nil
}

...means that no SceneKit nodes will be created for your anchors. Because there are no nodes, your renderer(_:didAdd:for:) function is never called, so the code inside that method never creates any SceneKit content.
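
If you'd rather keep the method around than delete it, a minimal alternative sketch (mine, not part of the original code) is to return an empty node; ARSCNView then attaches that node to the anchor and still calls renderer(_:didAdd:for:):

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    //Returning a node instead of nil gives ARSCNView something to attach
    //to the anchor, so renderer(_:didAdd:for:) still fires
    return SCNNode()
}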

If you turn on plane detection and delete or comment out the renderer(_:nodeFor:) method (or use the empty-node variant above), the rest of your code should get you something like this:

(The pure white area is your SCNPlane. I had to unfold my iPad cover on the white table to get enough scene detail for plane detection to find anything. Also, check the background... there was actually a moment at WWDC today where the merch store wasn't packed full of people.)

As for whether you need an A9, Apple's messaging is a little unclear here. When they say ARKit requires A9 or better, what they really mean is ARWorldTrackingSessionConfiguration does. And that's where all the best AR magic is. (There's even a UIRequiredDeviceCapabilities key for arkit that actually covers devices with world tracking support, so you can restrict your app on the App Store to being offered only to those devices.)

There's still some ARKit without world tracking, though. Run a session with the base class ARSessionConfiguration and you get orientation-only tracking. (No position tracking, no plane detection, no hit testing.)
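
In code, that means you can branch on what the device supports before running a session. A quick sketch (using the same beta-era class names as the rest of this thread, and assuming the arScene outlet from the question):

if ARWorldTrackingSessionConfiguration.isSupported {
    //A9 or newer: full world tracking, plane detection, hit testing
    arScene.session.run(ARWorldTrackingSessionConfiguration())
} else {
    //Older devices: orientation-only tracking
    arScene.session.run(ARSessionConfiguration())
}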

What does that get you? Well, if you've played the current version of Pokémon GO, it works like that: because it tracks only device orientation, not position, you can't get closer to Pikachu or walk around behind him — the illusion that he occupies the real world holds as long as you only tilt or turn your device without moving it.

You load 3D content with SceneKit and place it in AR just like you load and place it in any other SceneKit app/game. There are plenty of resources out there for this, and lots of ways to do it. One of them you'll find in the Xcode template when you create a new AR project and choose SceneKit. The loading part goes something like this:

let scene = SCNScene(named: "ship.scn", inDirectory: "assets.scnassets")!
let ship = scene.rootNode.childNode(withName: "ship", recursively: true)!

Then to place it:

ship.simdPosition = float3(0, 0, -0.5) 
// half a meter in front of the *initial* camera position
myARSCNView.scene.rootNode.addChildNode(ship)

The main difference to remember for placing things in AR is that positions are measured in meters (and your content needs to be designed so that it's sensibly sized in meters).
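
To tie this back to the .dae file in your question: SceneKit loads Collada files through the same SCNScene(named:) API, and you can rescale after loading if the asset wasn't modeled in meters. A minimal sketch (the file and node names here are hypothetical):

//Hypothetical names; any .dae in the app bundle loads the same way
if let scene = SCNScene(named: "model.dae"),
   let model = scene.rootNode.childNode(withName: "model", recursively: true) {
    model.scale = SCNVector3(0.01, 0.01, 0.01) //e.g. an asset modeled in centimeters
    model.simdPosition = float3(0, 0, -0.5)    //half a meter in front of the initial camera
    myARSCNView.scene.rootNode.addChildNode(model)
}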

answered Sep 17 '22 by rickster