I needed to know when an Android phone was rotated in the x, y and z directions for a tilt-based program, so I used the widely circulated code that combines the magnetometer and accelerometer to compute orientation, since the orientation sensor itself is unreliable. This works fine in the x and z planes. But suppose the user is on a bus that is moving in a circle while holding the phone in a fixed landscape position: Android registers y-direction movement even though the user never moved the phone along y, and the program fails. Using acceleration alone isn't accurate either. How can I solve this problem of detecting y-axis motion relative to the user?
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class MotionListener {
    SensorManager sensorManager;
    float[] accelerometerValues;
    float[] magneticFieldValues;

    public MotionListener(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor aSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        Sensor mfSensor = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        sensorManager.registerListener(myAccelerometerListener, aSensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(myMagneticFieldListener, mfSensor, SensorManager.SENSOR_DELAY_GAME);
    }

    // Caches the latest magnetometer reading.
    final SensorEventListener myMagneticFieldListener = new SensorEventListener() {
        public void onSensorChanged(SensorEvent sensorEvent) {
            if (sensorEvent.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
                magneticFieldValues = sensorEvent.values.clone();
        }
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
        }
    };

    final SensorEventListener myAccelerometerListener = new SensorEventListener() {
        public void onSensorChanged(SensorEvent sensorEvent) {
            if (sensorEvent.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
                accelerometerValues = sensorEvent.values.clone();
            float[] values = new float[3];
            float[] R = new float[9];
            try {
                // Fuse gravity and geomagnetic readings into a rotation matrix,
                // then extract azimuth, pitch and roll (in radians).
                SensorManager.getRotationMatrix(R, null, accelerometerValues, magneticFieldValues);
                SensorManager.getOrientation(R, values);
                // Convert radians to degrees.
                values[0] = (float) (values[0] * 180 / Math.PI);
                values[1] = (float) (values[1] * 180 / Math.PI);
                values[2] = (float) (values[2] * 180 / Math.PI);
                System.out.println((int) values[0] + " " + (int) values[1] + " " + (int) values[2]);
                setTiltCoordinates(values);
            } catch (NullPointerException e) {
                // One of the sensors has not reported yet; skip this event.
            }
        }
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
        }
    };

    void setTiltCoordinates(float[] values) {
        // Hands the orientation angles to the rest of the app (stub).
    }
}
Well, if you are just looking at the accelerometer data, all you have is three values in the x, y and z directions. One of these will have the greatest pull, and you can usually take that to be the direction in which 'down' lies.
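As a minimal sketch of that idea (the helper name is mine, not part of any API), picking the axis with the largest reading might look like this:

// Hypothetical helper: returns 0, 1 or 2 for the axis (x, y or z) whose
// reading has the greatest magnitude, i.e. the axis closest to 'down'
// when the handset is roughly at rest.
static int dominantAxis(float[] accel) {
    int axis = 0;
    for (int i = 1; i < 3; i++) {
        if (Math.abs(accel[i]) > Math.abs(accel[axis])) {
            axis = i;
        }
    }
    return axis;
}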
There can be issues with movement. For example, if the user spins on the spot, the phone feels acceleration that is not due to gravity, and if that acceleration is great enough it might start to look like 'down' lies in its direction. However, you would have to be spinning fast enough to create close to one g of acceleration, which I very much doubt a bus could do.
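To put a rough number on that claim (the speed and turn radius below are assumed, illustrative figures): the centripetal acceleration of a turning vehicle is v * v / r.

// Illustrative only: a bus taking a curve at 10 m/s (36 km/h)
// with a 40 m turn radius.
double v = 10.0;            // speed in m/s (assumed)
double r = 40.0;            // turn radius in m (assumed)
double a = (v * v) / r;     // centripetal acceleration = 2.5 m/s^2
double g = 9.81;
System.out.println(a / g);  // roughly 0.25 g, well short of one g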
Your next big problem is that the x, y and z values are just three perpendicular axes, and each handset could be oriented to them differently. On one phone, y (normally considered the up-and-down axis) might point through the screen, with the screen lying in the zx plane. Another phone might have y at an angle to the screen.
So, from accelerometer data alone you can't tell which way the phone is oriented. Though perhaps the Android SDK accounts for this and 'adjusts' the raw accelerometer data so that all phones report it the same way.
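One common way to separate the slowly changing gravity component from acceleration the user actually imparts is a simple low-pass filter; the remainder approximates linear acceleration. This is only a sketch, and the smoothing factor alpha is an assumed starting value, not a tuned one:

float[] gravity = new float[3];
float[] linear = new float[3];

// Inside onSensorChanged for TYPE_ACCELEROMETER:
final float alpha = 0.8f; // assumed smoothing factor; tune for your use case
for (int i = 0; i < 3; i++) {
    // Low-pass: keep only the slowly changing (gravity) part of the signal.
    gravity[i] = alpha * gravity[i] + (1 - alpha) * sensorEvent.values[i];
    // High-pass remainder: acceleration due to motion rather than gravity.
    linear[i] = sensorEvent.values[i] - gravity[i];
}

On devices that provide them, the composite Sensor.TYPE_GRAVITY and Sensor.TYPE_LINEAR_ACCELERATION sensors perform this separation for you.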