Is it possible to have ARView and ARSCNView coexist?

I tried creating the two views in a view controller and running them at the same time, but that doesn't work: the ARView takes over from the ARSCNView, even if it's not hooked up to an outlet. I then tried adding one view to the other, and that doesn't work either.

 @IBOutlet var arView: ARView!
 @IBOutlet var sceneView: ARSCNView!

This works on its own for ARView:

  let anchor = try! Glasses.loadScene()
  arView.scene.anchors.append(anchor)
  arView.session.run(ARFaceTrackingConfiguration())

This works on its own for ARSCNView, and then I track everything in the delegate functions:

  sceneView.session.run(ARFaceTrackingConfiguration())
  sceneView.delegate = self
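
(For reference, the kind of delegate callback being referred to might look like the sketch below; this is a hypothetical illustration, not part of the original question. `ViewController` and the face-anchor handling are assumptions.)

```swift
// Hypothetical sketch of an ARSCNViewDelegate callback for face tracking.
extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        // Only react to face anchors produced by ARFaceTrackingConfiguration.
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        // Update node geometry from the tracked face here.
        print(faceAnchor.transform)
    }
}
```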

This does not work:

  let ar = ARSCNView(frame: view.frame)
  ar.delegate = self
  ar.session.run(ARFaceTrackingConfiguration())
  arView.addSubview(ar)

This does not work:

  let ar = ARView(frame: view.frame)
  let anchor = try! Glasses.loadScene()
  ar.scene.anchors.append(anchor)
  ar.session.run(ARFaceTrackingConfiguration())
  sceneView.addSubview(ar)

When I tried adding one view to the other, I commented out the session.run calls, but that didn't make a difference.

Joe asked Dec 24 '19 04:12
1 Answer

Hardware constraints

It's a hardware constraint rather than a technical inability to run two separate ARSessions, one for ARView and one for ARSCNView. If you run two ARViews, for example, you'll see the live camera feed in both views, but tracking does not work.

Working solution

If you drive the ARSCNView's session from the ARView's session, then both views will work. Note, however, that this solution is for the rear camera's configuration.

sceneView.session = arView.session
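
Since the original question targets face tracking, here is an untested sketch of the same session-sharing idea with ARFaceTrackingConfiguration. It is an assumption that face tracking propagates through the shared session in the same way; only the rear-camera case is confirmed below.

```swift
// Untested assumption: the shared-session trick also applies to face tracking.
if ARFaceTrackingConfiguration.isSupported {
    arView.session.run(ARFaceTrackingConfiguration())
    sceneView.session = arView.session   // mirror the same session in the SceneKit view
}
```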

Code

I tested it on an iPhone X (Xcode 13.2.1, iOS 15.3.1), and it works fine.

import ARKit
import RealityKit

class ViewController: UIViewController {

    var sceneView = ARSCNView(frame: .zero)
    var arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()

        // ARSCNView
        sceneView.frame = CGRect(origin: CGPoint(x: 0, y: 0),
                                 size: CGSize(width: 400, height: 400))
        self.view.addSubview(sceneView)

        sceneView.scene = SCNScene()
        sceneView.autoenablesDefaultLighting = true
        let sphere = SCNNode(geometry: SCNSphere(radius: 0.2))
        sphere.position.z = -1.0
        sceneView.scene.rootNode.addChildNode(sphere)

        // SESSIONS
        sceneView.session = arView.session

        // ARView
        arView.frame = CGRect(origin: CGPoint(x: 0, y: 405),
                              size: CGSize(width: 400, height: 400))
        self.view.addSubview(arView)

        let boxScene = try! Experience.loadBox()
        arView.scene.anchors.append(boxScene)
    }
}
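
The hardcoded 400x400 frames above assume a particular screen size. As a sketch, the same split layout could instead be expressed with standard Auto Layout constraints (plain UIKit API, nothing ARKit-specific; this is an alternative layout, not part of the original answer):

```swift
// Sketch: pin sceneView to the top half of the screen and arView to the bottom half.
[sceneView, arView].forEach {
    $0.translatesAutoresizingMaskIntoConstraints = false
    view.addSubview($0)
}
NSLayoutConstraint.activate([
    sceneView.topAnchor.constraint(equalTo: view.topAnchor),
    sceneView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
    sceneView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
    sceneView.heightAnchor.constraint(equalTo: view.heightAnchor, multiplier: 0.5),
    arView.topAnchor.constraint(equalTo: sceneView.bottomAnchor),
    arView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
    arView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
    arView.bottomAnchor.constraint(equalTo: view.bottomAnchor),
])
```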

Let's check the configuration:

DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
    print(self.arView.session.configuration!)
}

Result:

<ARWorldTrackingConfiguration: 0x282ce79c0 
     worldAlignment=Gravity 
     lightEstimation=Enabled 
     frameSemantics=None 
     videoFormat=< ARVideoFormat: 0x283ad9bd0 
                   imageResolution=(1920, 1440) 
                   framesPerSecond=(60)           
                   captureDeviceType=AVCaptureDeviceTypeBuiltInWideAngleCamera 
                   captureDevicePosition=(1) > 
     autoFocus=Enabled 
     environmentTexturing=Automatic 
     wantsHDREnvironmentTextures=Enabled 
     planeDetection=Horizontal 
     collaboration=Disabled 
     userFaceTracking=Disabled 
     sceneReconstruction=None 
     appClipCodeTracking=Disabled>
Andy Jazz answered Sep 29 '22 13:09