How to detect iPhone movement in space using accelerometer?

I am trying to make an application that would detect what kind of shape you made with your iPhone, using the accelerometer. As an example, if you draw a circle with your hand holding the iPhone, the app would be able to redraw it on the screen. This could also work with squares, or even more complicated shapes. The only application I've seen doing such a thing is AirPaint (http://vimeo.com/2276713), but it doesn't seem to be able to do it in real time.

My first try is to apply a low-pass filter on the X and Y parameters from the accelerometer, and to make a pointer move toward these values, proportionally to the size of the screen. But this is clearly not enough: the accuracy is very low, and if I shake the device it also makes the pointer move...

Any ideas about that? Do you think accelerometer data is enough to do it, or should I consider using other data, such as the compass?

Thanks in advance!

asked Apr 20 '10 by Thomas Desert


4 Answers

OK, I have found something that seems to work, but I still have some problems. Here is how I proceed (assuming the device is held vertically):

1 - I have my default x, y, and z values.
2 - I extract the gravity vector from this data using a low pass filter.
3 - I subtract the normalized gravity vector from each of x, y, and z, and get the movement acceleration.
4 - Then, I integrate this acceleration value with respect to time, so I get the velocity.
5 - I integrate this velocity again with respect to time, and find a position.

All of the code below lives in the accelerometer:didAccelerate: delegate method of my controller. I am trying to make a ball move according to the position I found. Here is my code:

NSTimeInterval interval = 0;
NSDate *now = [NSDate date];
if (previousDate != nil)
{
    interval = [now timeIntervalSinceDate:previousDate];
}
previousDate = now;

//Isolating gravity vector
gravity.x = acceleration.x * kFilteringFactor + gravity.x * (1.0 - kFilteringFactor);
gravity.y = acceleration.y * kFilteringFactor + gravity.y * (1.0 - kFilteringFactor);
gravity.z = acceleration.z * kFilteringFactor + gravity.z * (1.0 - kFilteringFactor);
float gravityNorm = sqrt(gravity.x * gravity.x + gravity.y * gravity.y + gravity.z * gravity.z);

//Removing gravity vector from initial acceleration
filteredAcceleration.x = acceleration.x - gravity.x / gravityNorm;
filteredAcceleration.y = acceleration.y - gravity.y / gravityNorm;
filteredAcceleration.z = acceleration.z - gravity.z / gravityNorm;

//Calculating velocity related to time interval
velocity.x = velocity.x + filteredAcceleration.x * interval;
velocity.y = velocity.y + filteredAcceleration.y * interval;
velocity.z = velocity.z + filteredAcceleration.z * interval;

//Finding position
position.x = position.x + velocity.x * interval * 160;
position.y = position.y + velocity.y * interval * 230;

If I execute this, I get quite good values; I mean, I can see the acceleration going positive or negative according to the movements I make. But when I try to apply that position to my ball view, I can see it moving, but with a propensity to go more in one direction than the other. For example, if I draw circles with my device, I see the ball describing curves towards the top-left corner of the screen. Something like this: http://img685.imageshack.us/i/capturedcran20100422133.png/

Do you have any ideas about what is happening? Thanks in advance!

answered Nov 08 '22 by Thomas Desert


The problem is that you can't integrate acceleration twice to get position. Not without knowing the initial position and velocity. Remember the +C term that you added in school when learning about integration? Well, by the time you get to position it is a c·t + k term, and it is significant. That's before you consider that the acceleration data you're getting back is quantised and averaged, so you're not actually integrating the actual acceleration of the device. Those errors end up being large when integrated twice.

Watch the AirPaint demo closely and you'll see exactly this happening: the shapes rendered are significantly different from the shapes moved.

Even devices that have some position and velocity sensing (a Wiimote, for example) have trouble doing gesture recognition. It is a tricky problem that folks pay good money (to companies like AILive, for example) to solve for them.

Having said that, you can probably quite easily distinguish between certain types of gesture if their large-scale characteristics are different. A circle can be detected if the device has received accelerations in each of six angle ranges (for example). You could distinguish between swiping the iPhone through the air and shaking it.

To tell the difference between a circle and a square is going to be much more difficult.

answered by Ian


You need to look up how acceleration relates to velocity and velocity to position. My mind is having a wee fart at the moment, but I am sure it's the integral... you want to integrate acceleration with respect to time. Wikipedia should help you with the maths, and I am sure there is a good library somewhere that can help you out.

Just remember that the accelerometers are not perfect, nor polled fast enough. Really sudden movements may not be picked up that well. But for gently drawing in the air, it should work fine.

answered by thecoshman


It seems like you are normalizing your gravity vector before subtracting it from the instantaneous acceleration. That keeps the relative orientation but discards the scale. The latest device I tested (admittedly not an iDevice) returned gravity at roughly -9.8, which is probably calibrated to m/s². Assuming no other acceleration, if you normalize this and then subtract it from the filtered pass, you end up with a current accel of -8.8 instead of 0.0.

Two options:
- You can just subtract out the gravity vector after the filter pass.
- Capture the initial accel vector length, normalize the accel and the gravity vectors, then scale the accel vector by the dot product of the accel and gravity normals.

It is also worth remembering to take the orientation of the device into account.

answered by ScottP