 

LiDAR and RealityKit – Capture a Real World Texture for a Scanned Model

Task

I would like to capture a real-world texture and apply it to a reconstructed mesh produced with the help of the LiDAR scanner. I suppose that Projection-View-Model matrices should be used for that. The texture must be made from a fixed Point of View, for example, from the center of a room. However, it would be an ideal solution if we could apply the environmentTexturing data, collected as a cube-map texture in a scene.
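
As a rough illustration of that Projection-View-Model idea, here is a minimal sketch (mine, not a confirmed solution): a mesh vertex already transformed into world space by its anchor's model transform is projected into the captured camera image to obtain a UV coordinate. The function name and the near/far values are illustrative, and a landscape-right orientation is assumed.

import ARKit
import UIKit

// Sketch: map one world-space vertex to a 0...1 UV coordinate in the captured
// image by multiplying it with the camera's Projection * View matrices.
func textureCoordinate(for worldVertex: SIMD3<Float>,
                       camera: ARCamera,
                       imageSize: CGSize) -> SIMD2<Float>? {

    let view = camera.viewMatrix(for: .landscapeRight)
    let projection = camera.projectionMatrix(for: .landscapeRight,
                                             viewportSize: imageSize,
                                             zNear: 0.001,
                                             zFar: 1000)
    let clip = projection * view * SIMD4<Float>(worldVertex.x, worldVertex.y, worldVertex.z, 1)
    guard clip.w > 0 else { return nil }                // vertex is behind the camera

    let ndc = SIMD2<Float>(clip.x, clip.y) / clip.w     // normalized device coordinates, -1...1
    return SIMD2<Float>(ndc.x * 0.5 + 0.5,              // map to 0...1
                        1 - (ndc.y * 0.5 + 0.5))        // flip Y for image space
}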


Look at 3D Scanner App. It's a reference app allowing us to export a model with its texture.

I need to capture a texture in one pass; I do not need to update it in real time. I realize that changing the PoV leads to a wrong perception of the texture, in other words, to texture distortion. I also realize that RealityKit performs dynamic tessellation and automatic texture mipmapping (the texture's resolution depends on the distance from which it was captured).

import RealityKit
import ARKit
import Metal
import ModelIO

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        arView.session.delegate = self
        // Visualize the reconstructed LiDAR mesh.
        arView.debugOptions.insert(.showSceneUnderstanding)

        let config = ARWorldTrackingConfiguration()
        // Enable LiDAR scene reconstruction and automatic environment texturing.
        config.sceneReconstruction = .mesh
        config.environmentTexturing = .automatic
        arView.session.run(config)
    }
}
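
As a follow-up to the setup above, here is a minimal sketch, not part of the original snippet, of how the reconstructed geometry could be collected inside the same ViewController through the ARSessionDelegate, so the mesh anchors are available for a later texturing pass:

    // Sketch: collect the reconstructed mesh anchors delivered by ARKit, so their
    // geometry is available for a later texturing pass.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for meshAnchor in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            // Each ARMeshAnchor carries an ARMeshGeometry (vertices, normals, faces).
            print("Mesh anchor \(meshAnchor.identifier): \(meshAnchor.geometry.vertices.count) vertices")
        }
    }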

Question

  • How can I capture and apply a real-world texture to a reconstructed 3D mesh?


Asked Sep 08 '20 by Andy Jazz


People also ask

Can LiDAR be used for 3D scanning?

The LiDAR 3D scanner is a 3D scanning device based on LiDAR technology that can capture the shape and size of a physical object and display them on a computer as digitized three-dimensional coordinates in a point cloud file. Not all LiDAR scanners feature SLAM.

What is LiDAR 3D scanning?

LiDAR is a remote sensing technology that uses laser pulses to collect measurements, which are then used to create 3D models and maps of objects and environments.

How accurate are LiDAR scans?

For aerial uses such as terrain surveying, both LiDAR and photogrammetry can provide an accuracy of 1 to 3 cm (0.4 to 1.2 inches). When scanning at very short distances like 10 cm (4 in) away from the object, the accuracy of LiDAR can be as precise as 10 μm, less than the thickness of a hair.


2 Answers

How it can be done in Unity

I'd like to share some interesting info about how Unity's AR Foundation works with a mesh coming from LiDAR. At the moment – November 01, 2020 – there's an absurd situation: native ARKit developers cannot capture the texture of a scanned object using standard high-level RealityKit tools, yet Unity's AR Foundation users (creating ARKit apps) can do this with the ARMeshManager script. I don't know whether this script was developed by the AR Foundation team or just by the developers of a small creative startup (and then subsequently acquired), but the fact remains.

To use ARKit meshing with AR Foundation, you just need to add the ARMeshManager component to your scene. The component exposes such features as Texture Coordinates, Color and Mesh Density.


If anyone has more detailed information on how this should be configured or scripted in Unity, please post about it in this thread.

Answered Oct 19 '22 by Andy Jazz


You can check out the answer describing the MetalWorldTextureScan project, which demonstrates how to scan your environment and create a textured mesh using ARKit and Metal.
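
As a rough sketch of the general approach, with type and function names that are my own rather than taken from MetalWorldTextureScan: each scan position stores the captured camera image together with the camera's view and projection matrices, so the mesh can be textured afterwards by projecting its vertices into those images.

import ARKit
import UIKit

// Sketch: one "capture" = the camera image plus the matrices needed to
// project mesh vertices into that image during the texturing pass.
struct TextureCapture {
    let pixelBuffer: CVPixelBuffer
    let viewMatrix: simd_float4x4
    let projectionMatrix: simd_float4x4
}

func makeCapture(from session: ARSession, viewportSize: CGSize) -> TextureCapture? {
    guard let frame = session.currentFrame else { return nil }
    let camera = frame.camera
    return TextureCapture(
        pixelBuffer: frame.capturedImage,
        viewMatrix: camera.viewMatrix(for: .landscapeRight),
        projectionMatrix: camera.projectionMatrix(for: .landscapeRight,
                                                  viewportSize: viewportSize,
                                                  zNear: 0.001,
                                                  zFar: 1000))
}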

Answered Oct 19 '22 by beatTheSystem42