
How to get smooth orientation data in android

I have an app which uses orientation data and works very well using the pre-API-8 method of a Sensor.TYPE_ORIENTATION sensor. Smoothing that data was relatively easy.

I am trying to update the code to avoid this deprecated approach. The new standard approach is to replace the single Sensor.TYPE_ORIENTATION with a combination of Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD. As that data is received, it is sent (via SensorManager.getRotationMatrix()) to SensorManager.getOrientation(). This (theoretically) returns the same information as Sensor.TYPE_ORIENTATION did (apart from different units and axis orientation).

However, this approach generates data which is much more jittery (i.e. noisy) than the deprecated method (which still works). So, comparing the same information on the same device, the deprecated method provides much less noisy data than the current one.

How do I get the same (less noisy) data that the deprecated method used to provide?

To make my question a little clearer: I have read various answers on this subject and have tried all sorts of filters: a simple Kalman filter, an IIR low-pass filter (as suggested in the answers), and median filters of between 5 and 19 points, but so far I have yet to get anywhere close to the smoothness of the data the phone supplies via TYPE_ORIENTATION.

Asked Jan 08 '15 by Neil Townsend



2 Answers

Apply a low-pass filter to your sensor output.

This is my low-pass filter method:

private static final float ALPHA = 0.5f;
// a lower alpha gives smoother, but laggier, movement
...
private float[] applyLowPassFilter(float[] input, float[] output) {
    if ( output == null ) return input;

    for ( int i=0; i<input.length; i++ ) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}

Apply it like so:

float[] mGravity;
float[] mGeomagnetic;
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = applyLowPassFilter(event.values.clone(), mGravity);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = applyLowPassFilter(event.values.clone(), mGeomagnetic);
    if (mGravity != null && mGeomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];

        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = -orientation[0];
            invalidate();
        }
    }
}

This is obviously code for a compass; remove what you don't need.

Also, take a look at this SE question: How to implement low pass filter using java
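To get a feel for what this filter does numerically, here is a self-contained sketch (my own illustration, with made-up sample values) using the same update rule as the answer's method:

```java
// Standalone sketch: feed a jittery, roughly constant signal through the
// low-pass filter above and watch the output settle near the true value.
public class LowPassDemo {
    static final float ALPHA = 0.5f;

    static float[] applyLowPassFilter(float[] input, float[] output) {
        if (output == null) return input.clone();
        for (int i = 0; i < input.length; i++) {
            output[i] = output[i] + ALPHA * (input[i] - output[i]);
        }
        return output;
    }

    public static void main(String[] args) {
        float[] smoothed = null;
        // hypothetical sensor samples jittering around a true value of 10
        float[] noisy = {9.0f, 11.0f, 10.5f, 9.5f, 10.2f, 9.8f};
        for (float sample : noisy) {
            smoothed = applyLowPassFilter(new float[]{sample}, smoothed);
        }
        System.out.println(smoothed[0]); // settles close to 10
    }
}
```

With a lower ALPHA the output would deviate even less from 10, at the cost of responding more slowly to genuine orientation changes.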

Answered Oct 07 '22 by MeetTitan

It turns out that there is another, not particularly documented, way to get orientation data. Hidden in the list of sensor types is TYPE_ROTATION_VECTOR. So, set one up:

Sensor mRotationVectorSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
sensorManager.registerListener(this, mRotationVectorSensor, SensorManager.SENSOR_DELAY_GAME);

Then:

@Override
public void onSensorChanged(SensorEvent event) {
    final int eventType = event.sensor.getType();

    if (eventType != Sensor.TYPE_ROTATION_VECTOR) return;

    long timeNow = System.nanoTime(); // useful if you want to time-weight any later filtering

    float[] mOrientationData = new float[3];
    calcOrientation(mOrientationData, event.values.clone());

    // Do what you want with mOrientationData
}

The key mechanism is going from the incoming rotation data to an orientation vector via a rotation matrix. The slightly frustrating thing is that the orientation vector comes from quaternion data in the first place, but I can't see how to get the quaternion delivered directly. (If you ever wondered how quaternions relate to orientation and rotation information, and why they are used, see here.)

private void calcOrientation(float[] orientation, float[] incomingValues) {
    // Get the quaternion
    float[] quatF = new float[4];
    SensorManager.getQuaternionFromVector(quatF, incomingValues);

    // Get the rotation matrix
    //
    // This is a variant on the code presented in
    // http://www.euclideanspace.com/maths/geometry/rotations/conversions/quaternionToMatrix/
    // which has been altered for scaling and (I think) a different axis arrangement. It
    // tells you the rotation required to get between the phone's axis system and the
    // earth's.
    //
    // Phone axis system:
    // https://developer.android.com/guide/topics/sensors/sensors_overview.html#sensors-coords
    //
    // Earth axis system:
    // https://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix(float[], float[], float[], float[])
    //
    // Background information:
    // https://en.wikipedia.org/wiki/Rotation_matrix
    //
    float[][] rotMatF = new float[3][3];
    rotMatF[0][0] = quatF[1]*quatF[1] + quatF[0]*quatF[0] - 0.5f;
    rotMatF[0][1] = quatF[1]*quatF[2] - quatF[3]*quatF[0];
    rotMatF[0][2] = quatF[1]*quatF[3] + quatF[2]*quatF[0];
    rotMatF[1][0] = quatF[1]*quatF[2] + quatF[3]*quatF[0];
    rotMatF[1][1] = quatF[2]*quatF[2] + quatF[0]*quatF[0] - 0.5f;
    rotMatF[1][2] = quatF[2]*quatF[3] - quatF[1]*quatF[0];
    rotMatF[2][0] = quatF[1]*quatF[3] - quatF[2]*quatF[0];
    rotMatF[2][1] = quatF[2]*quatF[3] + quatF[1]*quatF[0];
    rotMatF[2][2] = quatF[3]*quatF[3] + quatF[0]*quatF[0] - 0.5f;

    // Get the orientation of the phone from the rotation matrix
    //
    // There is some discussion of this at
    // http://stackoverflow.com/questions/30279065/how-to-get-the-euler-angles-from-the-rotation-vector-sensor-type-rotation-vecto
    // in particular equation 451.
    //
    final float rad2deg = (float)(180.0 / Math.PI);
    orientation[0] = (float)Math.atan2(-rotMatF[1][0], rotMatF[0][0]) * rad2deg;
    orientation[1] = (float)Math.atan2(-rotMatF[2][1], rotMatF[2][2]) * rad2deg;
    // Note: the matrix above is the true rotation matrix scaled by 0.5 (the
    // quaternion is unit length). That cancels in the atan2 ratios, but not
    // in asin, so the element is doubled here.
    orientation[2] = (float)Math.asin (2f * rotMatF[2][0])             * rad2deg;
    if (orientation[0] < 0) orientation[0] += 360;
}

This seems to give data very similar in feel (I haven't run numeric tests) to the old TYPE_ORIENTATION data: it was usable for motion control of the device with only marginal filtering.
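One caveat if you do add filtering on top: naively low-pass filtering an angle breaks at the 0/360 wrap-around (averaging 359 and 1 should give 0, not 180). A sketch of my own (not from the answer, and assuming the azimuth is in degrees in [0, 360)) that filters the shortest signed difference instead:

```java
// Standalone sketch: a low-pass filter for a compass azimuth that handles
// the 0/360 degree wrap-around by filtering the shortest signed difference.
public class AzimuthFilter {
    private static final float ALPHA = 0.15f; // lower = smoother, laggier
    private Float smoothed = null;

    public float filter(float azimuthDeg) {
        if (smoothed == null) {
            smoothed = azimuthDeg;
            return smoothed;
        }
        // shortest signed difference, mapped into [-180, 180)
        float diff = azimuthDeg - smoothed;
        diff = (((diff + 180f) % 360f) + 360f) % 360f - 180f;
        // apply the usual low-pass update, then re-normalise into [0, 360)
        smoothed = ((smoothed + ALPHA * diff) % 360f + 360f) % 360f;
        return smoothed;
    }

    public static void main(String[] args) {
        AzimuthFilter f = new AzimuthFilter();
        f.filter(359f);
        System.out.println(f.filter(1f)); // stays near 359, no jump toward 180
    }
}
```

The double modulo is needed because Java's % keeps the sign of its left operand, so a single % could leave a negative result.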

There is also helpful information here, and a possible alternative solution here.

Answered Oct 07 '22 by Neil Townsend