What is ray-casting in ARKit and RealityKit for?
And when do I need to use the makeRaycastQuery instance method?
func makeRaycastQuery(from point: CGPoint,
                      allowing target: ARRaycastQuery.Target,
                      alignment: ARRaycastQuery.TargetAlignment) -> ARRaycastQuery?
Any help appreciated.
Simple ray-casting, the same way as hit-testing, helps to locate a 3D point on a real-world surface by projecting an imaginary ray from a screen point onto a detected plane. Apple's documentation (2019) gave the following definition of ray-casting:
Ray-casting is the preferred method for finding positions on surfaces in the real-world environment, but the hit-testing functions remain present for compatibility. With tracked ray-casting, ARKit and RealityKit continue to refine the results to increase the position accuracy of virtual content you place with a ray-cast.
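A tracked ray-cast is started with ARSession's trackedRaycast(_:updateHandler:) method. Here's a minimal sketch of that idea (the TrackedRaycaster type and the anchor parameter are illustrative names, not from the original post); note that you must keep a strong reference to the returned ARTrackedRaycast, otherwise the updates stop:
import ARKit
import RealityKit

final class TrackedRaycaster {
    // Keep a strong reference, otherwise ARKit stops refining the result.
    private var trackedRaycast: ARTrackedRaycast?

    func start(in arView: ARView, moving anchor: AnchorEntity) {
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                                  allowing: .estimatedPlane,
                                                  alignment: .horizontal)
        else { return }
        // ARKit calls the handler repeatedly with refined results over time.
        trackedRaycast = arView.session.trackedRaycast(query) { results in
            guard let result = results.first else { return }
            anchor.transform = Transform(matrix: result.worldTransform)
        }
    }

    func stop() {
        trackedRaycast?.stopTracking()
        trackedRaycast = nil
    }
}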
When the user wants to place virtual content onto a detected surface, it's a good idea to give them a visual hint. Many AR apps draw a focus circle or square that gives the user visual confirmation of the shape and alignment of the surfaces that ARKit is aware of. So, to find out where to put a focus circle or square in the real world, you may use an ARRaycastQuery to ask ARKit where any surfaces exist in the real world.
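As a sketch of that idea (updateFocusIndicator and focusEntity are hypothetical names; the assumption is that you call this once per frame, e.g. from ARSessionDelegate's session(_:didUpdate:)):
import ARKit
import RealityKit

// Hypothetical helper: call once per frame to keep a placement hint
// glued to the nearest detected surface under the screen center.
func updateFocusIndicator(_ focusEntity: Entity, in arView: ARView) {
    // ARView's convenience method builds and runs the query in one call.
    let results = arView.raycast(from: arView.center,
                                 allowing: .estimatedPlane,
                                 alignment: .any)
    guard let result = results.first else { return }
    focusEntity.setTransformMatrix(result.worldTransform, relativeTo: nil)
}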
Here's a UIKit example showing how to use the session's raycast(_:) instance method:
import UIKit
import ARKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    // Force-unwrapped for brevity; the .usdz file must be in the app bundle.
    let model = try! Entity.loadModel(named: "usdzModel")

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        self.raycasting()
    }

    fileprivate func raycasting() {
        // Build a query from the screen center against estimated horizontal planes.
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                                  allowing: .estimatedPlane,
                                                  alignment: .horizontal)
        else { return }
        // Take the nearest hit along the ray.
        guard let result = arView.session.raycast(query).first
        else { return }
        // Anchor the model at the hit's world transform.
        let raycastAnchor = AnchorEntity(world: result.worldTransform)
        raycastAnchor.addChild(model)
        arView.scene.anchors.append(raycastAnchor)
    }
}
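As an aside, RealityKit also offers an AnchorEntity(raycastResult:) initializer that anchors content directly to an ARRaycastResult instead of a fixed world transform, which can be a cleaner fit when the hit came from a raycast query like the one above.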
If you want to know how to use convex-ray-casting in RealityKit, read this post. If you want to know how to use hit-testing in RealityKit, read this post.
Here's sample code showing how to implement the same raycasting logic in SwiftUI:
import SwiftUI
import ARKit
import RealityKit

struct ContentView: View {

    @State private var arView = ARView(frame: .zero)
    // Force-unwrapped for brevity; the .usdz file must be in the app bundle.
    var model = try! Entity.loadModel(named: "robot")

    var body: some View {
        ARViewContainer(arView: $arView)
            .onTapGesture(count: 1) { self.raycasting() }
            .ignoresSafeArea()
    }

    fileprivate func raycasting() {
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                                  allowing: .estimatedPlane,
                                                  alignment: .horizontal)
        else { return }
        guard let result = arView.session.raycast(query).first
        else { return }
        let raycastAnchor = AnchorEntity(world: result.worldTransform)
        raycastAnchor.addChild(model)
        arView.scene.anchors.append(raycastAnchor)
    }
}
and then...
struct ARViewContainer: UIViewRepresentable {

    @Binding var arView: ARView

    func makeUIView(context: Context) -> ARView { return arView }
    func updateUIView(_ uiView: ARView, context: Context) { }
}
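By default, ARView configures its session automatically, so raycasting against .estimatedPlane works out of the box. If you target .existingPlaneGeometry or .existingPlaneInfinite instead, plane detection must actually be running; here's a minimal sketch, assuming you configure the session manually:
import ARKit
import RealityKit

func runPlaneDetection(on arView: ARView) {
    // Required only for .existingPlaneGeometry / .existingPlaneInfinite targets;
    // .estimatedPlane works without explicit plane detection.
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]
    arView.session.run(config)
}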
P.S. If you're building either of these two app variations from scratch (i.e. not using the Xcode AR template), don't forget to add the Privacy - Camera Usage Description key in the Info tab.