 

What is the real benefit of using Raycast in ARKit and RealityKit?

What is a ray-casting in ARKit and RealityKit for?

And when I need to use a makeRaycastQuery instance method:

func makeRaycastQuery(from point: CGPoint, 
                 allowing target: ARRaycastQuery.Target, 
                       alignment: ARRaycastQuery.TargetAlignment) -> ARRaycastQuery?

Any help appreciated.

Asked Jun 05 '19 by Andy Jazz



1 Answer

Ray-casting, like hit-testing, helps you locate a 3D point on a real-world surface by projecting an imaginary ray from a screen point onto a detected plane. Apple's documentation (2019) gave the following definition of ray-casting:

Ray-casting is the preferred method for finding positions on surfaces in the real-world environment, but the hit-testing functions remain present for compatibility. With tracked ray-casting, ARKit and RealityKit continue to refine the results to increase the position accuracy of virtual content you place with a ray-cast.
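The "tracked" ray-casting mentioned above can be sketched roughly like this (a minimal, hedged example assuming an existing `arView` and an `anchor` already in the scene; the function name `startTrackedRaycast` is hypothetical):

```swift
import ARKit
import RealityKit

// A tracked ray-cast: ARKit keeps calling the update handler as its
// understanding of the surface improves, so the anchor's position is
// refined over time instead of being set once.
func startTrackedRaycast(in arView: ARView,
                         moving anchor: AnchorEntity) -> ARTrackedRaycast? {

    guard let query = arView.makeRaycastQuery(from: arView.center,
                                          allowing: .estimatedPlane,
                                         alignment: .horizontal)
    else { return nil }

    return arView.session.trackedRaycast(query) { results in
        // Snap the anchor to the refined hit point on each update.
        if let result = results.first {
            anchor.transform.matrix = result.worldTransform
        }
    }
}
```

Keep a reference to the returned ARTrackedRaycast and call its stopTracking() method once the content is finally placed, otherwise ARKit keeps updating it.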

When the user wants to place virtual content onto a detected surface, it's a good idea to give them a visual hint. Many AR apps draw a focus circle or square that gives the user visual confirmation of the shape and alignment of the surfaces that ARKit is aware of. So, to find out where to put a focus circle or square in the real world, you can use an ARRaycastQuery to ask ARKit where any surfaces exist.
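One common way to drive such a focus indicator is a per-frame ray-cast from the session delegate. Here's a hedged sketch (the class name `FocusCircleDelegate` and the `focusEntity` model are hypothetical, not part of any framework):

```swift
import ARKit
import RealityKit

// Moves a focus indicator onto the nearest detected surface
// behind the screen center, once per camera frame.
class FocusCircleDelegate: NSObject, ARSessionDelegate {

    let arView: ARView
    let focusEntity: Entity   // your focus circle/square model

    init(arView: ARView, focusEntity: Entity) {
        self.arView = arView
        self.focusEntity = focusEntity
        super.init()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Ask ARKit where a surface lies behind the screen center.
        guard let result = arView.raycast(from: arView.center,
                                      allowing: .estimatedPlane,
                                     alignment: .any).first
        else { return }

        // Snap the focus indicator to the detected surface.
        focusEntity.transform.matrix = result.worldTransform
    }
}
```

Assign an instance of this delegate to `arView.session.delegate` so the indicator updates continuously while the user moves the device.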


UIKit implementation

Here's an example showing how to implement the raycast(query) instance method:

import UIKit
import ARKit
import RealityKit

class ViewController: UIViewController {
    
    @IBOutlet var arView: ARView!
    let model = try! Entity.loadModel(named: "usdzModel")
    
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        self.raycasting()
    }

    fileprivate func raycasting() {
        // Build a query that projects a ray from the screen center
        // onto any estimated horizontal plane.
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                              allowing: .estimatedPlane,
                                             alignment: .horizontal)
        else { return }

        // Take the closest surface the ray hit.
        guard let result = arView.session.raycast(query).first
        else { return }

        // Anchor the model at the hit point's world transform.
        let raycastAnchor = AnchorEntity(world: result.worldTransform)
        raycastAnchor.addChild(model)
        arView.scene.anchors.append(raycastAnchor)
    }
}

If you want to know how to use convex-ray-casting in RealityKit, read this post.


If you want to know how to use hit-testing in RealityKit, read this post.


SwiftUI implementation

Here's sample code showing how to implement the raycasting logic in SwiftUI:

import SwiftUI
import ARKit
import RealityKit

struct ContentView: View {
    
    @State private var arView = ARView(frame: .zero)
    var model = try! Entity.loadModel(named: "robot")
    
    var body: some View {
        ARViewContainer(arView: $arView)
            .onTapGesture(count: 1) { self.raycasting() }
            .ignoresSafeArea()
    }
    
    fileprivate func raycasting() {
        // Build a query that projects a ray from the screen center
        // onto any estimated horizontal plane.
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                              allowing: .estimatedPlane,
                                             alignment: .horizontal)
        else { return }

        // Take the closest surface the ray hit.
        guard let result = arView.session.raycast(query).first
        else { return }

        // Anchor the model at the hit point's world transform.
        let raycastAnchor = AnchorEntity(world: result.worldTransform)
        raycastAnchor.addChild(model)
        arView.scene.anchors.append(raycastAnchor)
    }
}

and then...

struct ARViewContainer: UIViewRepresentable {
    
    @Binding var arView: ARView
    
    func makeUIView(context: Context) -> ARView { return arView }
    func updateUIView(_ uiView: ARView, context: Context) { }
}

P.S.

If you're building either of these two app variations from scratch (i.e. not using the Xcode AR template), don't forget to add the Privacy – Camera Usage Description key (NSCameraUsageDescription) in the target's Info tab.
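In the raw Info.plist source that key looks like the fragment below (the usage string is just a placeholder; write your own user-facing explanation):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera for augmented reality.</string>
```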

Answered Oct 07 '22 by Andy Jazz