 

How to get a phone's azimuth with compass readings and gyroscope readings?

I wish to get my phone's current orientation by the following method:

  1. Get the initial orientation (azimuth) first via getRotationMatrix() and getOrientation().
  2. Add the integration of gyroscope reading over time to it to get the current orientation.
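To make the question concrete, here is a minimal Python sketch of this intended pipeline (all names are hypothetical: azimuth0 stands for the initial azimuth from getOrientation(); gyro_z and t stand for logged z-axis gyroscope rates in rad/s and their timestamps in seconds):

import numpy as np

# Hypothetical logged data standing in for the real sensor readings
azimuth0 = -1.2                          # initial azimuth from getOrientation(), rad
t        = np.linspace(0.0, 2.0, 201)    # sample timestamps, s
gyro_z   = np.full_like(t, 0.3)          # z-axis gyro rate, rad/s (constant here)

# Trapezoidal integration of the z rate over the window
dR = np.sum(0.5 * (gyro_z[1:] + gyro_z[:-1]) * np.diff(t))

# The naive update described above; whether to add or subtract dR
# depends on sign conventions (see the Remarks below)
current_orientation = azimuth0 + dR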

Phone Orientation:

The phone's x-y plane is kept parallel to the ground plane, i.e., the phone is held in a "texting-while-walking" orientation.

"getOrientation()" Return Values:

The Android API allows me to easily get the orientation, i.e., azimuth, pitch, and roll, from getOrientation().

Please note that this method always returns its values within the ranges [-PI, 0] and [0, PI].

My Problem:

Since the integral of the gyroscope reading, denoted by dR, may be quite big, when I do CurrentOrientation += dR, the CurrentOrientation may exceed the [-PI, 0] and [0, PI] ranges.

What manipulations are needed so that I ALWAYS get the current orientation within the [-PI, 0] and [0, PI] ranges?

I have tried the following in Python, but I highly doubt its correctness.

import numpy as np
import scipy.integrate

# Integrate the gyroscope rates over this step's time window
rotation = scipy.integrate.trapz(gyroSeries, timeSeries)
# Fold (headingDirection - rotation) back into [-PI, PI]
if (headingDirection - rotation) < -np.pi:
    headingDirection += 2 * np.pi
elif (headingDirection - rotation) > np.pi:
    headingDirection -= 2 * np.pi
# Complementary filter: blend the integrated heading with the mean
# compass azimuth observed during step i
headingDirection = ALPHA * (headingDirection - rotation) + \
                   (1 - ALPHA) * np.mean(azimuth[np.array(stepNo.tolist()) == i])
# Fold the result back into [-PI, PI]
if headingDirection < -np.pi:
    headingDirection += 2 * np.pi
elif headingDirection > np.pi:
    headingDirection -= 2 * np.pi
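For reference, the folding back into [-PI, PI) can also be written as a single modular expression (just a sketch; it says nothing about whether the integration or the filter above is otherwise correct):

import numpy as np

def wrap_to_pi(angle):
    """Wrap an angle in radians into the half-open interval [-pi, pi)."""
    return (angle + np.pi) % (2.0 * np.pi) - np.pi

# e.g. wrap_to_pi(np.pi + 0.1) == -np.pi + 0.1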

Remarks

This is NOT that simple, because it involves the following trouble-makers:

  1. The orientation sensor reading goes from 0 down to -PI, then DIRECTLY JUMPS to +PI and gradually comes back to 0 via +PI/2 (a sketch of a jump-safe angle difference follows this list).
  2. The integration of the gyroscope reading also leads to some trouble: should I add dR to the orientation or subtract dR?
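As promised above, here is a sketch of a jump-safe way to take the difference of two headings, which sidesteps trouble-maker 1 (it does not settle the add-versus-subtract question in trouble-maker 2):

import numpy as np

def angle_diff(a, b):
    """Smallest signed difference a - b in radians, immune to the +/-PI jump."""
    return np.arctan2(np.sin(a - b), np.cos(a - b))

# Readings on either side of the jump differ by a small rotation, not by ~2*PI:
# angle_diff(np.pi - 0.05, -np.pi + 0.05)  ->  approximately -0.1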

Please do refer to the Android documentation first before giving a confirmed answer.

Speculative answers will not help.

asked Aug 06 '13 by Sibbs Gambling



1 Answer

The orientation sensor actually derives its readings from the real magnetometer and the accelerometer.

I guess maybe this is the source of the confusion. Where is this stated in the documentation? More importantly, does the documentation somewhere explicitly state that the gyro readings are ignored? As far as I know the method described in this video is implemented:

Sensor Fusion on Android Devices: A Revolution in Motion Processing

This method uses the gyros and integrates their readings. This pretty much renders the rest of the question moot; nevertheless I will try to answer it.


The orientation sensor is already integrating the gyro readings for you, that is how you get the orientation. I don't understand why you are doing it yourself.

You are not doing the integration of the gyro readings properly; it is more complicated than CurrentOrientation += dR (which is incorrect). If you need to integrate the gyro readings (I don't see why, the SensorManager is already doing it for you), please read Direction Cosine Matrix IMU: Theory for how to do it properly (Equation 17).
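To make "more complicated than CurrentOrientation += dR" concrete, here is a minimal Python sketch of the kind of small-angle update that manuscript describes (in the spirit of its Equation 17). It is an illustration, not a drop-in implementation; the gyro rates are assumed to be in rad/s in the sensor frame:

import numpy as np

def dcm_gyro_update(R, gyro, dt):
    """One small-angle update of a direction cosine matrix from gyro rates.

    R    : 3x3 rotation matrix, the orientation at the previous step
    gyro : (wx, wy, wz) angular rates in rad/s, sensor frame
    dt   : time step in seconds
    """
    wx, wy, wz = np.asarray(gyro) * dt          # small rotation angles
    dR = np.array([[1.0, -wz,  wy],
                   [ wz, 1.0, -wx],
                   [-wy,  wx, 1.0]])            # I + skew(w*dt), first order
    R = R @ dR                                  # right- vs left-multiplication depends
                                                # on your frame convention
    u, _, vt = np.linalg.svd(R)                 # re-orthonormalize so numerical
    return u @ vt                               # errors do not accumulate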

Don't try integrating with Euler angles (aka azimuth, pitch, roll); nothing good will come of it.

Please use either quaternions or rotation matrices in your computations instead of Euler angles. If you work with rotation matrices, you can always convert them to Euler angles, see

Computing Euler angles from a rotation matrix by Gregory G. Slabaugh

(The same is true for quaternions.) There are (in the non-degenerate case) two ways to represent a rotation with Euler angles, that is, you will get two sets of Euler angles. Pick the one that is in the range you need. (In the case of gimbal lock, there are infinitely many Euler angles; see the PDF above.) Just promise you won't start using Euler angles again in your computations after the rotation-matrix-to-Euler-angles conversion.
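As an illustration of that conversion, here is a sketch in the spirit of Slabaugh's note for the non-degenerate case, assuming R = Rz(yaw) Ry(pitch) Rx(roll); the indices have to be adapted to whatever convention your rotation matrix uses:

import numpy as np

def euler_from_matrix(R):
    """Return the two (yaw, pitch, roll) solutions for R = Rz(yaw)Ry(pitch)Rx(roll)."""
    if abs(R[2, 0]) >= 1.0 - 1e-9:
        raise ValueError("gimbal lock: infinitely many solutions")
    pitch1 = -np.arcsin(R[2, 0])
    pitch2 = np.pi - pitch1
    solutions = []
    for pitch in (pitch1, pitch2):
        c = np.cos(pitch)
        roll = np.arctan2(R[2, 1] / c, R[2, 2] / c)
        yaw  = np.arctan2(R[1, 0] / c, R[0, 0] / c)
        solutions.append((yaw, pitch, roll))
    return solutions  # pick the triple that lies in the range you need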

It is unclear what you are doing with the complementary filter. You can implement a pretty damn good sensor fusion based on the Direction Cosine Matrix IMU: Theory manuscript, which is basically a tutorial. It's not trivial to do it but I don't think you will find a better, more understandable tutorial than this manuscript.

One thing that I had to discover myself when I implemented sensor fusion based on this manuscript was that the so-called integral windup can occur. I took care of it by bounding the TotalCorrection (page 27). You will understand what I am talking about if you implement this sensor fusion.
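To give a flavour of what that bounding can look like, here is a sketch with hypothetical names (the actual TotalCorrection is the quantity defined on page 27 of the manuscript):

import numpy as np

MAX_CORRECTION = 0.1   # an assumed bound on the integral term; tune for your sensors

def accumulate_correction(total_correction, drift_error, ki, dt):
    """Accumulate the integral part of the drift correction, but clamp it so a
    long disturbance cannot wind it up without limit (anti-windup)."""
    total_correction = total_correction + ki * drift_error * dt
    return np.clip(total_correction, -MAX_CORRECTION, MAX_CORRECTION)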



UPDATE: Here I answer your questions that you posted in comments after accepting the answer.

I think the compass gives me my current orientation by using gravity and the magnetic field, right? Is the gyroscope used in the compass?

Yes, if the phone is more or less stationary for at least half a second, you can get a good orientation estimate by using gravity and the compass only. Here is how to do it: Can anyone tell me whether gravity sensor is as a tilt sensor to improve heading accuracy?

No, the gyroscopes are not used in the compass.
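The idea behind the gravity-plus-compass estimate (and behind what getRotationMatrix() does internally) can be sketched in a few lines. This is a simplification that ignores magnetic declination and assumes the device is at rest, so the accelerometer measures only gravity:

import numpy as np

def azimuth_from_accel_mag(accel, mag):
    """Heading of the device's y axis from gravity and magnetic field,
    both given as 3-vectors in the device (sensor) frame."""
    up = accel / np.linalg.norm(accel)      # at rest, the accelerometer sees only gravity
    east = np.cross(mag, up)                # component of the field orthogonal to gravity
    east /= np.linalg.norm(east)
    north = np.cross(up, east)
    # azimuth of the device y axis, increasing clockwise from magnetic north
    return np.arctan2(east[1], north[1])

# e.g. with the phone flat and its y axis pointing magnetic north:
# azimuth_from_accel_mag([0, 0, 9.8], [0, 20.0, -40.0]) -> approximately 0.0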

Could you please kindly explain why the integration done by me is wrong? I understand that if my phone's pitch points up, Euler angles fail. But is anything else wrong with my integration?

There are two unrelated things: (i) the integration should be done differently, (ii) Euler angles are trouble because of gimbal lock. I repeat, these two are unrelated.

As for the integration: here is a simple example of how you can actually see what is wrong with your integration. Let x and y be the axes of the horizontal plane in the room. Get a phone in your hands. Rotate the phone around the x axis (of the room) by 45 degrees, then around the y axis (of the room) by 45 degrees. Then, repeat these steps from the beginning, but now rotate around the y axis first, and then around the x axis. The phone ends up in a totally different orientation. If you do the integration according to CurrentOrientation += dR, you will see no difference! Please read the above linked Direction Cosine Matrix IMU: Theory manuscript if you want to do the integration properly.
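Here is the same thought experiment in a few lines of Python, so the difference is visible numerically (a sketch; rot_x and rot_y are rotations about the room's fixed x and y axes):

import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

a = np.deg2rad(45)
R_x_then_y = rot_y(a) @ rot_x(a)   # rotate about the room's x axis first, then y
R_y_then_x = rot_x(a) @ rot_y(a)   # rotate about the room's y axis first, then x
print(np.allclose(R_x_then_y, R_y_then_x))   # False: the final orientations differ
# Naive per-axis angle addition records (45, 45) in both cases and sees no difference.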

As for the Euler angles: they screw up the stability of the application and it is enough for me not to use them for arbitrary rotations in 3D.

I still don't understand why you are trying to do it yourself, why you don't want to use the orientation estimate provided by the platform. Chances are, you cannot do better than that.

answered Oct 16 '22 by Ali