I would like to capture a real-world texture and apply it to a reconstructed mesh produced with the help of a LiDAR scanner. I suppose Projection-View-Model matrices should be used for that. The texture must be made from a fixed point of view, for example, from the center of a room. However, it would be an ideal solution if we could apply the environmentTexturing data, collected as a cube-map texture in the scene.
Look at 3D Scanner App. It's a reference app allowing us to export a model with its texture.
I need to capture a texture in one pass; I do not need to update it in real time. I realize that changing the point of view leads to a wrong perception of the texture, in other words, to its distortion. I also realize that RealityKit has dynamic tessellation and automatic texture mipmapping (the texture's resolution depends on the distance it was captured from).
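If the environmentTexturing route is viable, I suppose the generated cube map could be pulled out of the AREnvironmentProbeAnchor objects that ARKit adds to the session. Here is a rough sketch of what I mean (the delegate method is standard ARKit; whether its MTLTexture can be reused for mesh texturing is exactly my question). It would live in the ARSessionDelegate shown below:

// Sketch: read the cube-map texture from environment probes generated by ARKit.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for anchor in anchors {
        guard let probe = anchor as? AREnvironmentProbeAnchor,
              let cubeMap = probe.environmentTexture else { continue }  // MTLTexture of type .typeCube
        print("Environment cube map:", cubeMap.width, "x", cubeMap.height)
        // Open question: how to unwrap / project this cube map onto the reconstructed mesh?
    }
}

My current session setup: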
import UIKit
import RealityKit
import ARKit
import Metal
import ModelIO

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        arView.session.delegate = self
        // Visualize the reconstructed mesh while scanning.
        arView.debugOptions.insert(.showSceneUnderstanding)

        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh          // LiDAR scene reconstruction
        config.environmentTexturing = .automatic    // ARKit generates environment probes (cube maps)
        arView.session.run(config)
    }
}
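Since one capture from a fixed point of view is enough for me, I imagine grabbing a single ARFrame together with its camera matrices while standing in the center of the room. A minimal sketch, using the same imports as above (CaptureMoment and captureMoment(from:viewportSize:) are my own names, not ARKit API):

// Sketch: snapshot the data needed for a one-shot texture projection.
struct CaptureMoment {
    let pixelBuffer: CVPixelBuffer        // camera image to project onto the mesh
    let viewMatrix: simd_float4x4         // world -> camera
    let projectionMatrix: simd_float4x4   // camera -> clip space
}

func captureMoment(from session: ARSession, viewportSize: CGSize) -> CaptureMoment? {
    guard let frame = session.currentFrame else { return nil }
    let camera = frame.camera
    return CaptureMoment(pixelBuffer: frame.capturedImage,
                         viewMatrix: camera.viewMatrix(for: .landscapeRight),
                         projectionMatrix: camera.projectionMatrix(for: .landscapeRight,
                                                                   viewportSize: viewportSize,
                                                                   zNear: 0.001,
                                                                   zFar: 1000.0))
}

The captured pixel buffer and the two matrices should be everything needed to project the image onto the reconstructed mesh later, outside of the AR session.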
A LiDAR 3D scanner is a 3D scanning device based on LiDAR technology that can capture the shape and size of a physical object and represent it on a computer as digitized three-dimensional coordinates in a point-cloud file. Not all LiDAR scanners feature SLAM.
LiDAR is a remote sensing technology that uses laser pulses to collect measurements, which are then used to create 3D models and maps of objects and environments.
For aerial uses such as terrain surveying, both LiDAR and photogrammetry can provide an accuracy of 1 to 3 cm (0.4 to 1.2 inches). When scanning at very short distances, like 10 cm (4 in) away from the object, the accuracy of LiDAR can be as fine as 10 μm, less than the thickness of a human hair.
How it can be done in Unity
I'd like to share some interesting info about how Unity's AR Foundation works with a mesh coming from LiDAR. At the moment (November 01, 2020) there's an absurd situation: native ARKit developers cannot capture the texture of a scanned object using standard high-level RealityKit tools, yet Unity's AR Foundation users (creating ARKit apps) can do this using the ARMeshManager script. I don't know whether this script was developed by the AR Foundation team or just by developers of a small creative startup (and subsequently bought), but the fact remains.
To use ARKit meshing with AR Foundation, you just need to add the ARMeshManager component to your scene. As you can see in the screenshot, it exposes options such as Texture Coordinates, Color and Mesh Density.
If anyone has more detailed information on how this must be configured or scripted in Unity, please post about it in this thread.
You can check out the answer over here. It describes the MetalWorldTextureScan project, which demonstrates how to scan your environment and create a textured mesh using ARKit and Metal.
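For anyone who wants the gist without reading the whole project: the usual idea behind this kind of texturing (my own rough summary, not necessarily how MetalWorldTextureScan implements it) is to project each mesh vertex into a captured camera frame to obtain a UV coordinate, then sample the camera image at that UV. A minimal CPU-side sketch in Swift:

import ARKit
import UIKit

// Sketch: derive a UV coordinate for one mesh vertex by projecting it into a captured frame.
// worldVertex is assumed to be in world space; camera belongs to the frame that was captured.
func uv(for worldVertex: simd_float3, camera: ARCamera, imageSize: CGSize) -> simd_float2? {
    // Project the world-space point into pixel coordinates of the captured image.
    let point = camera.projectPoint(worldVertex,
                                    orientation: .landscapeRight,
                                    viewportSize: imageSize)
    // Discard vertices that fall outside the image; they need another frame or stay untextured.
    guard point.x >= 0, point.x <= imageSize.width,
          point.y >= 0, point.y <= imageSize.height else { return nil }
    // Normalize to the 0...1 range expected by texture coordinates.
    return simd_float2(Float(point.x / imageSize.width),
                       Float(point.y / imageSize.height))
}

A real implementation also has to reject vertices that face away from the camera or are occluded by other geometry, which is where a Metal-based approach like the one in the linked project pays off.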