 

What sensors does ARCore use?

What sensors does ARCore use: single camera, dual-camera, IMU, etc. in a compatible phone?

Also, is ARCore dynamic enough to still work if a sensor is not available by switching to a less accurate version of itself?

caansnews asked Jan 24 '19 22:01


1 Answer

Updated: May 10, 2022.

About ARCore and ARKit sensors

Google's ARCore, like Apple's ARKit, uses a similar set of sensors to track the real-world environment. ARCore can use a single RGB camera along with an IMU, which is a combination of an accelerometer, a magnetometer and a gyroscope. Your phone runs world tracking at 60 fps, while the Inertial Measurement Unit operates at 1000 Hz. There is also one more sensor that ARCore can use – an iToF camera for scene reconstruction (Apple's counterpart is the LiDAR scanner). ARCore 1.25 supports the Raw Depth API and the Full Depth API.
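ARCore's actual camera/IMU fusion is proprietary, but the general idea – blend a high-rate gyroscope signal that drifts with a low-rate, drift-free reference (camera pose or accelerometer gravity vector) – can be sketched with a toy complementary filter. This is illustrative Python only, not ARCore code; the function name and constants are made up:

```python
def complementary_filter(gyro_rates, reference_angles, dt=0.001, alpha=0.98):
    """Blend high-rate gyro integration (responsive but drifting) with
    low-rate absolute angle measurements (noisy but drift-free).
    alpha close to 1 trusts the gyro; (1 - alpha) pulls the estimate
    back toward the reference, limiting accumulated drift."""
    angle = reference_angles[0]
    estimates = []
    for rate, ref in zip(gyro_rates, reference_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * ref
        estimates.append(angle)
    return estimates
```

With a biased gyro (a constant false rotation rate) and a stationary reference, the estimate stays bounded near zero instead of drifting away – which is conceptually what the camera does for the IMU in visual-inertial odometry.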

Read what Google says about its COM method, built on Camera + IMU:

Concurrent Odometry and Mapping – An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used for fixing a drift in the tracked motion.

Here's Google US15595617 Patent: System and method for concurrent odometry and mapping.
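The patent's core idea – integrate motion (odometry) while building a map, then use that map to correct drift when a known landmark is seen again – can be illustrated with a deliberately tiny 1-D sketch. This is hypothetical Python for intuition only, nothing like Google's actual implementation:

```python
def com_toy(motion_deltas, landmark_obs):
    """Toy 1-D 'Concurrent Odometry and Mapping':
    - odometry: integrate (noisy) motion deltas into a pose
    - mapping: store each new landmark's world position
    - drift fix: when a mapped landmark is re-observed, snap the
      pose back to agree with the map."""
    pose = 0.0
    world = {}          # landmark id -> first estimated world position
    trajectory = []
    for delta, (lid, rel) in zip(motion_deltas, landmark_obs):
        pose += delta                  # odometry (accumulates drift)
        if lid in world:
            pose = world[lid] - rel    # drift fix: trust the map
        else:
            world[lid] = pose + rel    # mapping: add new landmark
        trajectory.append(pose)
    return trajectory, world
```

Re-observing landmark "A" after a noisy motion step pulls the pose estimate back to a drift-free value, which is the "fixing a drift in the tracked motion" the patent abstract describes.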


  • in 2014...2017 Google favored a Multicam + DepthCam config (Tango project)
  • in 2018...2020 Google favored a SingleCam + IMU config
  • in 2021 Google returned to a Multicam + DepthCam config


We all know that the biggest problem for Android devices is calibration. iOS devices don't have this issue, because Apple controls its own hardware and software. Poor calibration leads to errors in 3D tracking, so all your virtual 3D objects may "float" in a poorly tracked scene. If your phone has no iToF sensor, there's no miraculous button against bad tracking (and you can't switch to a less accurate version of tracking). The only solution in such a situation is to re-track your scene from scratch. However, tracking quality is much higher when your device is equipped with a ToF camera.

Here are five main rules for good tracking results (if you have no ToF camera):

  1. Track your scene not too fast, not too slow

  2. Track appropriate surfaces and objects

  3. Use well lit environment when tracking

  4. Don't track reflective or refractive objects

  5. Horizontal planes are more reliable than vertical ones


SingleCam config vs MultiCam config

One of the biggest problems of ARCore (and it's an ARKit problem too) is Energy Impact. The higher the frame rate, the better the tracking results – but the Energy Impact at 30 fps is HIGH, and at 60 fps it's VERY HIGH. Such an energy impact will quickly drain your smartphone's battery (due to an enormous burden on the CPU/GPU). Now imagine using 2 cameras for ARCore: your phone must process 2 image sequences at 60 fps in parallel, process and store feature points and AR anchors, and simultaneously render animated 3D graphics with hi-res textures at 60 fps. That's too much for your CPU/GPU. In such a case, the battery will be dead in 30 minutes and as hot as a boiler. Users don't like that, because it makes for a poor AR experience.
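To make that burden concrete, here's a back-of-envelope per-frame time budget. This is illustrative arithmetic only (real pipelines overlap work across cores and hardware blocks), with made-up function and parameter names:

```python
def frame_budget_ms(fps, camera_streams=1):
    """Rough per-stream processing budget for one frame, in milliseconds.
    Doubling the frame rate or the number of camera streams halves
    the time available to process each image."""
    return 1000.0 / fps / camera_streams

# One camera at 30 fps leaves ~33 ms per frame; two cameras at 60 fps
# leave ~8 ms per stream - before rendering has consumed anything.
```

This is why a dual-camera, 60 fps AR pipeline puts so much more strain on the CPU/GPU (and the battery) than a single-camera, 30 fps one.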

Andy Jazz answered Oct 08 '22 10:10