I tried creating the two views in one view controller and running them at the same time, but that doesn't work: the ARView takes over from the ARSCNView, even when it isn't hooked up to an outlet. I then tried adding one view as a subview of the other, and that doesn't work either.
@IBOutlet var arView: ARView!
@IBOutlet var sceneView: ARSCNView!
This works on its own for ARView:
let anchor = try! Glasses.loadScene()
arView.scene.anchors.append(anchor)
arView.session.run(ARFaceTrackingConfiguration())
This works on its own for ARSCNView, and I then track everything in the delegate methods:
sceneView.session.run(ARFaceTrackingConfiguration())
sceneView.delegate = self
This does not work:
let ar = ARSCNView(frame: view.frame)
ar.delegate = self
ar.session.run(ARFaceTrackingConfiguration())
arView.addSubview(ar)
This does not work:
let ar = ARView(frame: view.frame)
let anchor = try! Glasses.loadScene()
ar.scene.anchors.append(anchor)
ar.session.run(ARFaceTrackingConfiguration())
sceneView.addSubview(ar)
When I tried adding one view to the other, I commented out the session.run calls, but that didn't make a difference.
It's a hardware constraint rather than a technical inability to run two separate ARSessions, one for ARView and one for ARSCNView. If you run two ARViews, for example, you'll see that the live camera feed is present in both views, but tracking does not work.
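Here is a minimal sketch that reproduces the two-ARView case (the view names are my own):
// Each ARView automatically creates and runs its own ARSession,
// so two ARViews mean two sessions competing for the same camera hardware.
let upperView = ARView(frame: CGRect(x: 0, y: 0, width: 400, height: 400))
let lowerView = ARView(frame: CGRect(x: 0, y: 405, width: 400, height: 400))
view.addSubview(upperView)
view.addSubview(lowerView)
// The camera image shows up in both views, but tracking fails.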
If you drive the ARSCNView's session from the ARView's session, then both views will work. However, this solution applies to a rear camera configuration.
sceneView.session = arView.session
I built it on an iPhone X (Xcode 13.2.1, iOS 15.3.1), and I should say that it works fine.
import ARKit
import RealityKit

class ViewController: UIViewController {

    var sceneView = ARSCNView(frame: .zero)
    var arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()

        // ARSCNView
        sceneView.frame = CGRect(origin: CGPoint(x: 0, y: 0),
                                 size: CGSize(width: 400, height: 400))
        self.view.addSubview(sceneView)
        sceneView.scene = SCNScene()
        sceneView.autoenablesDefaultLighting = true

        let sphere = SCNNode(geometry: SCNSphere(radius: 0.2))
        sphere.position.z = -1.0
        sceneView.scene.rootNode.addChildNode(sphere)

        // SESSIONS
        sceneView.session = arView.session

        // ARView
        arView.frame = CGRect(origin: CGPoint(x: 0, y: 405),
                              size: CGSize(width: 400, height: 400))
        self.view.addSubview(arView)

        let boxScene = try! Experience.loadBox()
        arView.scene.anchors.append(boxScene)
    }
}
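Note that there is no explicit session.run(...) call anywhere: as soon as an anchor is appended, ARView configures and runs its session automatically, and sceneView simply renders from that same session. If you need non-default options, a sketch of one approach (my own addition, not part of the tested code above) is to run a configuration on the shared session yourself:
// Assumption: a configuration run explicitly on the shared session
// drives both views, since they reference the same ARSession object.
arView.automaticallyConfigureSession = false  // keep ARView from re-running its defaults
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal]
arView.session.run(config)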
Let's check which configuration ARView ran automatically:
DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
    print(self.arView.session.configuration!)
}
Result:
<ARWorldTrackingConfiguration: 0x282ce79c0
worldAlignment=Gravity
lightEstimation=Enabled
frameSemantics=None
videoFormat=< ARVideoFormat: 0x283ad9bd0
imageResolution=(1920, 1440)
framesPerSecond=(60)
captureDeviceType=AVCaptureDeviceTypeBuiltInWideAngleCamera
captureDevicePosition=(1) >
autoFocus=Enabled
environmentTexturing=Automatic
wantsHDREnvironmentTextures=Enabled
planeDetection=Horizontal
collaboration=Disabled
userFaceTracking=Disabled
sceneReconstruction=None
appClipCodeTracking=Disabled>
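For the face tracking case from the question, you could try running ARFaceTrackingConfiguration on the shared session yourself; since I only tested the rear camera setup above, treat this as an untested sketch:
// Untested sketch: switch the shared session over to the front camera.
if ARFaceTrackingConfiguration.isSupported {
    sceneView.delegate = self  // face-tracking callbacks, as in the question
    arView.session.run(ARFaceTrackingConfiguration())
}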