How can I measure distances in real time (using the video camera?) on the iPhone, like the app that compares the known size of a card against its apparent size in the image to work out the actual distance?
Are there any other ways to measure distances? Or how would I go about implementing the card method? What framework should I use?
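The card method boils down to the pinhole camera model: an object of known real width W at distance Z projects to w pixels when the focal length (expressed in pixels) is f, so w = f·W/Z and therefore Z = f·W/w. A minimal sketch of that relation, assuming the pixel focal length is known (it can be derived from the camera's field of view and image width; the function name and numbers are illustrative):

```python
# Sketch of the known-reference ("card") method via the pinhole camera model.
# focal_px: focal length in pixels (assumed calibrated per device).
# real_width: true width of the reference object (e.g. a card), any unit.
# pixel_width: width the object spans in the image, in pixels.
def distance_from_reference(focal_px: float, real_width: float, pixel_width: float) -> float:
    """Distance to an object of known real_width that spans pixel_width pixels."""
    return focal_px * real_width / pixel_width

# An ID-1 card is 85.6 mm wide; if it spans 214 px with a 2000 px focal length,
# the camera is 2000 * 85.6 / 214 = 800 mm away:
d = distance_from_reference(2000.0, 85.6, 214.0)  # millimetres
```

The same formula works for any reference object, which is why apps pick a standard-size card: its real width is known in advance.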
Google's augmented reality app “Measure” turns ARCore-compatible Android smartphones into digital measuring tapes, as reported by Ars Technica. Using the app appears to be rather simple: launch Measure, point the phone's camera at an object, then pick two points to measure the distance between them.
Open the Measure app, then follow any onscreen instructions that ask you to move your device around. This gives your device a frame of reference for the object you're measuring and the surface it's on. Keep moving your device until a circle with a dot in the center appears.
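Once an AR framework (ARKit or ARCore) has anchored the two picked points in world coordinates, the measurement itself is just the Euclidean distance between them. A framework-free sketch, assuming the two points (in metres) have already been obtained from hit-test or raycast results:

```python
import math

# Euclidean distance between two 3D points in world coordinates (metres).
# In a real app these points would come from ARKit/ARCore hit tests; here
# they are plain tuples so the math stands on its own.
def point_distance(a: tuple, b: tuple) -> float:
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

# Two points half a metre apart along the x axis:
length = point_distance((0.0, 0.0, 0.0), (0.5, 0.0, 0.0))  # 0.5
```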
Researchers found the iPhone's CoreMotion Pedometer underestimated steps by a mean of just 7.2% and demonstrated a mean percent difference of 5.7% when compared to an ActiGraph GT9X Activity Monitor.
Apple's latest products, the iPhone 12 Pro and Pro Max, iPhone 13 Pro and Pro Max, and iPad Pro now feature a built-in lidar scanner that can create 3D representations of close-range objects (up to 5 m away).
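A LiDAR pipeline turns each depth-map sample into a 3D point by unprojecting it through the camera intrinsics. A sketch of that per-pixel operation, assuming the intrinsics (fx, fy, cx, cy) are available from the capture metadata; the function name is illustrative:

```python
# Unproject a depth-map sample (pixel u, v with depth in metres) into a
# 3D point in camera space using the pinhole intrinsics:
#   x = (u - cx) * depth / fx,  y = (v - cy) * depth / fy,  z = depth
def unproject(u: float, v: float, depth: float,
              fx: float, fy: float, cx: float, cy: float) -> tuple:
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A sample at the principal point maps straight ahead of the camera:
p = unproject(960.0, 540.0, 2.0, fx=1500.0, fy=1500.0, cx=960.0, cy=540.0)
# -> (0.0, 0.0, 2.0)
```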
Well, you do have something for reference, hence the use of the card. That said, after watching a video of the app, it doesn't seem too user friendly.
So you either need a reference object of known size, or you need to deduce the size from the image. One idea I just had that might help is using the iPhone 4's flash (I'm sure it's very complicated, but it might just work for some stuff).
Here's what I think.
When the user wants to measure something, he takes a picture of it, but you actually capture two separate images: one with the flash on, one with the flash off. Then you can analyze the lighting differences between the images and the flash reflection to determine the scale of the image. This will only work for close and not-too-shiny objects, I guess.
But that's about the only other way I've thought of for deducing scale from an image without any fixed reference objects.
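The physics behind this speculative idea is the inverse-square law: the extra light the flash adds to a surface patch falls off roughly as 1/d², so the brightness difference between the flash-on and flash-off frames is a (very rough) depth cue. A toy sketch, assuming an idealized matte surface and a per-device calibration constant k; every name and number here is hypothetical:

```python
import math

# Toy sketch of the flash-difference idea: the brightness the flash adds
# falls off roughly as 1/d^2, so delta = k / d^2 and d = sqrt(k / delta).
# 'k' is a hypothetical per-device calibration constant; this assumes a
# matte (Lambertian) surface and is a speculation, not a proven method.
def distance_from_flash(on_brightness: float, off_brightness: float, k: float) -> float:
    delta = on_brightness - off_brightness
    if delta <= 0:
        raise ValueError("flash added no measurable light")
    return math.sqrt(k / delta)

# With k = 4.0, a flash that adds 1.0 brightness units implies d = 2.0:
d = distance_from_flash(1.5, 0.5, 4.0)
```

In practice surface reflectance varies wildly, which is exactly why the answer hedges about shiny objects: a mirror or glossy surface breaks the 1/d² assumption entirely.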
I like Ron Srebro's idea and have thought about something similar -- please share if you get it to work!
An alternative approach would be to use the camera's auto-focus feature. Point-and-shoot cameras often have a laser range finder that they use to auto-focus. The iPhone doesn't have one, and its f-stop is fixed. However, users can change the focus by tapping the camera screen, and the phone can also switch between regular and macro focus.
If the API exposes the current focus settings, maybe there's a way to use this to determine range?
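The optics behind focus-based ranging is the thin-lens equation, 1/f = 1/d_o + 1/d_i. If the lens-to-sensor distance d_i for the current focus setting were known, the object distance would follow directly; note that iOS only exposes a normalized 0–1 lens position rather than a calibrated distance, so this mapping would need per-device calibration. A sketch of the equation itself:

```python
# Thin-lens equation solved for the object distance:
#   1/f = 1/d_o + 1/d_i  =>  d_o = f * d_i / (d_i - f)
# focal_len and image_dist (lens-to-sensor distance) share the same unit.
def object_distance(focal_len: float, image_dist: float) -> float:
    if image_dist <= focal_len:
        raise ValueError("image distance must exceed focal length")
    return focal_len * image_dist / (image_dist - focal_len)

# A 4 mm lens focused with the sensor 4.04 mm behind it is focused
# on an object about 404 mm away:
d = object_distance(4.0, 4.04)
```

The steep nonlinearity is visible here: tiny changes in sensor position correspond to large changes in subject distance, which is why focus-based ranging gets very imprecise beyond a few metres.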
Another solution may be to use two laser pointers.
Basically you would shine two laser pointers at, say, a wall, keeping the beams parallel. The further back you move, the closer together the two dots appear in the video, even though the beams themselves stay the same distance apart. You can then come up with a simple formula for the distance based on how far apart the dots are in the image.
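That formula falls out of the pinhole model: two beams a fixed real distance D apart project to dots p pixels apart, with p = f·D/Z, so Z = f·D/p. A minimal sketch, assuming the pixel focal length f has been calibrated for the device; all names are illustrative:

```python
# Parallel-laser ranging under the pinhole model: dots a real distance
# beam_separation apart appear pixel_separation pixels apart, so
#   Z = focal_px * beam_separation / pixel_separation
# focal_px is the focal length in pixels (assumed calibrated per device).
def distance_from_lasers(focal_px: float, beam_separation: float,
                         pixel_separation: float) -> float:
    return focal_px * beam_separation / pixel_separation

# Beams 0.10 m apart whose dots land 100 px apart with f = 2000 px
# put the wall 2000 * 0.10 / 100 = 2.0 metres away:
z = distance_from_lasers(2000.0, 0.10, 100.0)
```

Note this is the same relation as the card method, with the fixed beam separation playing the role of the known-size reference object.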
See this thread for more details: Possible to measure distance with an iPhone and laser pointer?.