Autonomous robot location detection indoors

I've made a robot, controlled by Arduino and Processing, that moves around a room by rotating itself (like a sphere).

What I need is to be able to get the new location once it moves on the floor (let's say within a 3m x 3m room). I'm using a 9DOF sensor (3 axes of accelerometer data, 3 axes gyroscopic, and 3 axes of magnetic data) to determine its roll, pitch and yaw and also its direction.

How is it possible to accurately identify the location of the robot in Cartesian (x, y, z) coordinates relative to its starting position? I cannot use GPS, since the movement is less than 20 cm per rotation and the robot will be used indoors.

I found some indoor ranging and 3D positioning solutions, such as Pozyx or a fixed camera, but I need something cost-efficient.

Is there any way to convert the 9DOF data to get the new location, or another sensor that can do that? Any other solution, such as an algorithm?

asked Aug 19 '15 21:08 by Apollon1954

2 Answers

As pointed out in the comments, integrating acceleration gives velocity, and integrating that again gives position. This is, however, not very accurate, as errors accumulate in no time.
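To see why, here is a minimal 1-D dead-reckoning sketch (illustrative only, not code from the linked guide): a small constant accelerometer bias, typical of a cheap IMU, turns into quadratically growing position error after the double integration.

```python
def dead_reckon(accels, dt):
    """Integrate acceleration twice to estimate position (1-D for clarity)."""
    v, x = 0.0, 0.0
    positions = []
    for a in accels:
        v += a * dt          # first integration: velocity
        x += v * dt          # second integration: position
        positions.append(x)
    return positions

# A stationary robot: the true acceleration is 0, but the sensor has a small bias.
dt = 0.01                                     # 100 Hz sample rate
bias = 0.05                                   # 0.05 m/s^2 constant bias (assumed)
readings = [0.0 + bias for _ in range(1000)]  # 10 seconds of "measurements"
path = dead_reckon(readings, dt)
# After 10 s the estimated drift is roughly 0.5 * bias * t^2 = 2.5 m,
# even though the robot never moved.
```

The error grows with the square of elapsed time, which is why raw integration alone cannot track a robot over more than a few seconds.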

Instead, the usual approach is "sensor fusion", which combines the data from several sensors into a better estimate of, e.g., the position. It will still accumulate error over time if you rely on the accelerometer and gyro alone. The magnetic vector will help, but the result will probably still be inaccurate.

I found the following guide online that gives an introduction to sensor fusion with Kalman filters on an Arduino.

http://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=1114&context=aerosp

Warning: you need to know some math to get this up and running.
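If the Kalman math in the guide is too much at first, the fusion idea can be demonstrated with a much simpler relative, the complementary filter. The sketch below is a hypothetical 1-axis example (not from the guide): the gyro is accurate over short intervals but drifts, while the accelerometer-derived tilt angle is drift-free but noisy, so the filter blends the two.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse fast-but-drifting gyro rates with drift-free accelerometer angles."""
    angle = accel_angles[0]        # initialize from the accelerometer
    estimates = []
    for rate, acc_angle in zip(gyro_rates, accel_angles):
        # Short-term: integrate the gyro. Long-term: pull toward the accel angle.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc_angle
        estimates.append(angle)
    return estimates

# Stationary robot tilted 10 degrees; the gyro has a 0.5 deg/s bias (assumed values).
dt = 0.01
gyro = [0.5] * 1000                # pure bias, true rotation rate is 0
accel = [10.0] * 1000              # accelerometer reports the true tilt
est = complementary_filter(gyro, accel, dt)
# Integrating the gyro alone would drift to 15 degrees after 10 s; the fused
# estimate stays near 10 degrees (with a small bias-induced offset).
```

The same blend-a-fast-sensor-with-a-slow-reference principle is what the Kalman filter does, except the Kalman filter computes the blending weight optimally from the noise statistics instead of using a fixed `alpha`.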

answered Oct 20 '22 16:10 by Sigurd V

My answer does not include a specific implementation, and robotics is not my area of expertise (I'm a researcher in machine learning, NLP, and AI). However, I believe my high-level suggestion will still be useful, because your problem is stated at a fairly general level.

SLAM (Simultaneous Localization and Mapping) is one of the best-known fields studying how to estimate a robot's sequence of locations from sensory-motor data, and it contains many methods for doing exactly that.

Researchers have studied SLAM methods for various specific situations, such as slippery floors, rooms with complex shapes, or noisy sensors. I think your current setting is a little less specific than those.

So, if I were you, I would start by trying a standard SLAM method: pick several popular, general methods from a SLAM textbook and look for open-source software implementing them.

As far as I know, the particle filter (PF) is one of the most popular and successful methods in the SLAM field. The PF is an advanced variant of the Kalman filter (KF); it is very easy to implement, and its math is much simpler than the KF's. I think the PF is worth trying in your situation.
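For a feel of how a particle filter localizes a robot, here is a minimal 1-D sketch (an illustration under assumed noise values, not a SLAM implementation): it tracks a robot rolling along one axis of the 3 m room, given odometry and a noisy range measurement to one wall.

```python
import math
import random

def particle_filter_step(particles, control, measurement, noise_std=0.2):
    """One predict/update/resample cycle of a 1-D particle filter.

    particles  : candidate positions (m)
    control    : commanded displacement this step, from odometry (m)
    measurement: noisy distance reading to the wall at x = 0 (assumed beacon)
    """
    # 1. Predict: move every particle by the control, adding motion noise.
    moved = [p + control + random.gauss(0, noise_std) for p in particles]
    # 2. Update: weight each particle by how well it explains the measurement.
    weights = [math.exp(-((p - measurement) ** 2) / (2 * noise_std ** 2))
               for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # 3. Resample: draw a new particle set proportional to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

random.seed(0)
particles = [random.uniform(0, 3) for _ in range(500)]  # unknown start, 3 m room
true_pos = 1.0
for _ in range(30):
    true_pos += 0.05                                    # robot rolls 5 cm per step
    z = true_pos + random.gauss(0, 0.05)                # noisy range measurement
    particles = particle_filter_step(particles, 0.05, z)
estimate = sum(particles) / len(particles)
# The particle cloud collapses around the true position (~2.5 m here).
```

A real robot would use 2-D or 3-D particles and a measurement model built from its actual sensors, but the predict/update/resample loop is the same, and this simplicity is exactly why the PF is so popular.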

answered Oct 20 '22 16:10 by Light Yagmi