I understand that the quote below is accurate for Google's Project Tango:
By combining Depth Perception with Motion Tracking, the device can measure distances between points in an area that aren't in the same frame.
Based on the above, below are a few questions:
1. Can ARCore be used to measure distances the way Project Tango does?
2. How accurate is the result compared to Project Tango?
The ARCore SDK provides plane detection. To measure a distance, the user picks two points through the phone camera, and the app reports the distance between them; the same approach extends to measuring across multiple points.
Planes and points are a special type of object called a trackable. Like the name suggests, these are objects that ARCore will track over time.
ARCore uses SLAM (Simultaneous Localization And Mapping) to understand the position of your phone relative to your surroundings. Once feature points are detected, SLAM uses them to compute the change in location.
Anchors ensure that objects appear to stay at the same position and orientation in space, helping you maintain the illusion of virtual objects placed in the real world.
Ian M partially answers your first question in this answer. Here's how you might do it:
```java
Pose startPose = startAnchor.getPose();
Pose endPose = hitResult.getHitPose();

// Clean up the anchor.
session.removeAnchors(Collections.singleton(startAnchor));
startAnchor = null;

// Compute the difference vector between the two hit locations.
float dx = startPose.tx() - endPose.tx();
float dy = startPose.ty() - endPose.ty();
float dz = startPose.tz() - endPose.tz();

// Compute the straight-line distance.
float distanceMeters = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
```
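For the multi-point case mentioned above, the same per-segment math just needs to be summed over consecutive picked points. Here is a minimal, hedged sketch: `MultiPointDistance` and its methods are hypothetical helper names, and in a real app each `float[]{x, y, z}` would come from an anchor's pose via `pose.tx()`, `pose.ty()`, `pose.tz()`.

```java
import java.util.Arrays;
import java.util.List;

/** Hypothetical helper: sums straight-line segment lengths over an ordered list of 3-D points. */
public class MultiPointDistance {

    /** Euclidean distance between two points given as {x, y, z} arrays (e.g. from Pose.tx/ty/tz). */
    static float segmentDistance(float[] a, float[] b) {
        float dx = a[0] - b[0];
        float dy = a[1] - b[1];
        float dz = a[2] - b[2];
        return (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    /** Total path length over the points the user picked, in order. */
    static float pathLength(List<float[]> points) {
        float total = 0f;
        for (int i = 1; i < points.size(); i++) {
            total += segmentDistance(points.get(i - 1), points.get(i));
        }
        return total;
    }

    public static void main(String[] args) {
        // In a real app these would be anchor positions: new float[]{pose.tx(), pose.ty(), pose.tz()}
        List<float[]> picked = Arrays.asList(
                new float[]{0f, 0f, 0f},
                new float[]{1f, 0f, 0f},
                new float[]{1f, 2f, 0f});
        System.out.println(pathLength(picked)); // 1 m + 2 m = 3.0
    }
}
```

Keeping one anchor per picked point (rather than raw poses) is the safer design, since ARCore refines anchor positions as its understanding of the scene improves, which is also what determines measurement accuracy in practice.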