 

Understanding iBeacon distancing

The distance estimate provided by iOS is based on the ratio of the beacon's signal strength (RSSI) over the calibrated transmitter power (txPower). The txPower is the beacon's known signal strength (RSSI) measured at 1 meter away. Each beacon must be calibrated with this txPower value to allow accurate distance estimates.
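For illustration only, here is a minimal Java sketch of that calibration step, assuming you already have a way to collect RSSI samples while the receiver is held exactly 1 meter from the beacon (the method name is made up for this example):

// Hypothetical calibration sketch: average RSSI samples collected while the
// receiver is held exactly 1 meter from the beacon. The averaged value is what
// would be programmed into the beacon as its txPower field.
public static int calibrateTxPower(int[] rssiSamplesAtOneMeter) {
    long sum = 0;
    for (int rssi : rssiSamplesAtOneMeter) {
        sum += rssi;
    }
    // Round to the nearest whole dBm; txPower is advertised as a signed byte.
    return (int) Math.round((double) sum / rssiSamplesAtOneMeter.length);
}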

While the distance estimates are useful, they are not perfect, and require that you control for other variables. Be sure you read up on the complexities and limitations before misusing this.

When we were building the Android iBeacon library, we had to come up with our own independent algorithm because the iOS CoreLocation source code is not available. We measured a bunch of rssi measurements at known distances, then did a best fit curve to match our data points. The algorithm we came up with is shown below as Java code.

Note that the term "accuracy" here is iOS speak for distance in meters. This formula isn't perfect, but it roughly approximates what iOS does.

protected static double calculateAccuracy(int txPower, double rssi) {
  if (rssi == 0) {
    return -1.0; // if we cannot determine accuracy, return -1.
  }

  // Ratio of the measured signal strength to the calibrated 1-meter value.
  double ratio = rssi * 1.0 / txPower;
  if (ratio < 1.0) {
    // Closer than the 1-meter calibration point.
    return Math.pow(ratio, 10);
  }
  else {
    // Constants come from a best-fit curve over our measured data points.
    double accuracy = (0.89976) * Math.pow(ratio, 7.7095) + 0.111;
    return accuracy;
  }
}

Note: The values 0.89976, 7.7095 and 0.111 are the three constants calculated when solving for a best fit curve to our measured data points. YMMV


I have been investigating the matter of accuracy/RSSI/proximity with iBeacons very thoroughly, and I really think that all the resources on the Internet (blogs, posts on StackOverflow) get it wrong.

davidgyoung (accepted answer, > 100 upvotes) says:

Note that the term "accuracy" here is iOS speak for distance in meters.

Actually, most people say this, but I have no idea why! The documentation makes it very clear that CLBeacon.accuracy:

Indicates the one sigma horizontal accuracy in meters. Use this property to differentiate between beacons with the same proximity value. Do not use it to identify a precise location for the beacon. Accuracy values may fluctuate due to RF interference.

Let me repeat: one sigma accuracy in meters. All of the top 10 pages on Google on the subject have the term "one sigma" only in quotations from the docs, but none of them analyses the term, which is core to understanding this.

It is very important to explain what one sigma accuracy actually is. The following URLs are a good place to start: http://en.wikipedia.org/wiki/Standard_error, http://en.wikipedia.org/wiki/Uncertainty

In the physical world, when you make a measurement, you always get different results (because of noise, distortion, etc.), and very often the results form a Gaussian distribution. There are two main parameters describing a Gaussian curve:

  1. mean (which is easy to understand; it's the value at which the peak of the curve occurs).
  2. standard deviation, which says how wide or narrow the curve is. The narrower the curve, the better the accuracy, because all results are close to each other. If the curve is wide and not steep, it means that measurements of the same phenomenon differ very much from each other, so the measurement is of poor quality.

One sigma is another way to describe how narrow or wide the Gaussian curve is.
It simply says that if the mean of the measurement is X, and one sigma is σ, then 68% of all measurements will be between X - σ and X + σ.

Example: we measure distance and get a Gaussian distribution as a result. The mean is 10 m. If σ is 4 m, then it means that 68% of measurements were between 6 m and 14 m.
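As a purely illustrative Java sketch (the method name is made up here), the mean and one sigma of a batch of distance measurements can be computed like this:

// Illustrative only: compute the mean and one sigma (standard deviation) of a
// batch of distance estimates. If the measurements are roughly Gaussian, about
// 68% of the samples fall between mean - sigma and mean + sigma.
public static double[] meanAndOneSigma(double[] distanceEstimates) {
    double sum = 0;
    for (double d : distanceEstimates) {
        sum += d;
    }
    double mean = sum / distanceEstimates.length;

    double squaredError = 0;
    for (double d : distanceEstimates) {
        squaredError += (d - mean) * (d - mean);
    }
    double sigma = Math.sqrt(squaredError / distanceEstimates.length);

    return new double[] { mean, sigma };
}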

When we measure distance with beacons, we get the RSSI and the 1-meter calibration value, which allow us to estimate the distance in meters. But every measurement gives a different value, and the values form a Gaussian curve. And one sigma (and accuracy) is the accuracy of the measurement, not the distance!

It may be misleading, because when we move the beacon further away, one sigma actually increases because the signal is worse. But with different beacon power levels we can get totally different accuracy values without actually changing the distance. The higher the power, the smaller the error.

There is a blog post which thoroughly analyses the matter: http://blog.shinetech.com/2014/02/17/the-beacon-experiments-low-energy-bluetooth-devices-in-action/

The author has a hypothesis that accuracy is actually distance. He claims that beacons from Kontakt.io are faulty because when he increased the power to the maximum value, the accuracy value was very small for 1, 5 and even 15 meters. Before increasing the power, accuracy was quite close to the distance values. I personally think this behavior is correct, because the higher the power level, the smaller the impact of interference. And it's strange that Estimote beacons don't behave this way.

I'm not saying I'm 100% right, but apart from being an iOS developer I have a degree in wireless electronics, and I think that we shouldn't ignore the "one sigma" term from the docs. I would like to start a discussion about it.

It may be possible that Apple's algorithm for accuracy just collects recent measurements and analyses the Gaussian distribution of them, and that's how it sets the accuracy. I wouldn't exclude the possibility that they use info from the accelerometer to detect whether the user is moving (and how fast) in order to reset the previous distribution of distance values, because those have certainly changed.
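If that hypothesis were true, a crude approximation might look like the sketch below. This is entirely speculative and not Apple's implementation; the class and method names are invented, and the window size is arbitrary:

import java.util.ArrayDeque;
import java.util.Deque;

// Speculative sketch of the hypothesis above: keep a sliding window of recent
// distance estimates and report their one-sigma spread as "accuracy", clearing
// the window when the accelerometer says the user has moved.
public class AccuracyEstimator {
    private static final int WINDOW_SIZE = 20;
    private final Deque<Double> recentDistances = new ArrayDeque<>();

    public void addDistanceEstimate(double meters) {
        if (recentDistances.size() == WINDOW_SIZE) {
            recentDistances.removeFirst();
        }
        recentDistances.addLast(meters);
    }

    // Called when motion is detected: the old samples no longer describe
    // the current distance, so discard them.
    public void onDeviceMoved() {
        recentDistances.clear();
    }

    public double currentAccuracy() {
        int n = recentDistances.size();
        if (n < 2) {
            return -1.0; // not enough samples to estimate a spread
        }
        double sum = 0;
        for (double d : recentDistances) {
            sum += d;
        }
        double mean = sum / n;
        double squaredError = 0;
        for (double d : recentDistances) {
            squaredError += (d - mean) * (d - mean);
        }
        return Math.sqrt(squaredError / n); // one sigma, in meters
    }
}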


The iBeacon output power is measured (calibrated) at a distance of 1 meter. Let's suppose that this is -59 dBm (just an example). The iBeacon will include this number as part of its LE advertisement.

The listening device (iPhone, etc.) will measure the RSSI of the beacon. Let's suppose, for example, that this is -72 dBm.

Since these numbers are in dBm, the ratio of the power is actually the difference in dB. So:

ratio_dB = txCalibratedPower - RSSI

To convert that into a linear ratio, we use the standard formula for dB:

ratio_linear = 10 ^ (ratio_dB / 10)

If we assume conservation of energy, then the signal strength must fall off as 1/r^2. So:

power = power_at_1_meter / r^2

Since ratio_linear is power_at_1_meter / power, this gives r^2 = ratio_linear. Solving for r, we get:

r = sqrt(ratio_linear)

In Javascript, the code would look like this:

function getRange(txCalibratedPower, rssi) {
    var ratio_db = txCalibratedPower - rssi;
    var ratio_linear = Math.pow(10, ratio_db / 10);

    var r = Math.sqrt(ratio_linear);
    return r;
}

Note that if you're inside a steel building, there may be internal reflections that make the signal decay more slowly than 1/r^2. If the signal passes through a human body (water), then the signal will be attenuated. It's very likely that the antenna doesn't have equal gain in all directions. Metal objects in the room may create strange interference patterns. Etc., etc... YMMV.
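One common way to account for such environments is the log-distance path-loss model, which replaces the free-space exponent 2 with an environment-dependent exponent n. This is a generalization of the formula above, not what the JavaScript does, and the exponent has to be fitted from your own measurements; here it is sketched in Java:

// Log-distance path-loss model: a generalization of the 1/r^2 assumption above.
// n = 2 is free space; larger values are often quoted for cluttered indoor
// environments, but the right value for your space has to be measured, not guessed.
static double estimateRange(double txCalibratedPower, double rssi, double pathLossExponent) {
    double ratioDb = txCalibratedPower - rssi;           // path loss in dB relative to 1 meter
    return Math.pow(10.0, ratioDb / (10.0 * pathLossExponent));
}

With pathLossExponent = 2 this reduces to r = sqrt(ratio_linear), matching the getRange() function above.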


iBeacon uses Bluetooth Low Energy (LE) to keep aware of locations, and the distance/range of Bluetooth LE is 160 ft (http://en.wikipedia.org/wiki/Bluetooth_low_energy).


Distances to the source of iBeacon-formatted advertisement packets are estimated from the signal path attenuation calculated by comparing the measured received signal strength to the claimed transmit power which the transmitter is supposed to encode in the advertising data.

A path-loss-based scheme like this is only approximate and is subject to variation with things like antenna angles, intervening objects, and presumably a noisy RF environment. In comparison, systems really designed for distance measurement (GPS, radar, etc.) rely on precise measurements of propagation time, in some cases even examining the phase of the signal.

As Jiaru points out, 160 ft is probably beyond the intended range, but that doesn't necessarily mean that a packet will never get through, only that one shouldn't expect it to work at that distance.


It's possible, but it depends on the power output of the beacon you're receiving, other RF sources nearby, obstacles, and other environmental factors. The best thing to do is try it out in the environment you're interested in.


With multiple phones and beacons at the same location, it's going to be difficult to measure proximity with any high degree of accuracy. Try using the Android "b and l bluetooth le scanner" app to visualize the signal strength (distance) variations for multiple beacons, and you'll quickly discover that complex, adaptive algorithms may be required to provide any form of consistent proximity measurement.
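As one basic building block for such an algorithm (just a sketch, not a complete solution), an exponential moving average over the raw RSSI removes a lot of the jitter before you convert to distance; the class name and the alpha value here are arbitrary choices for illustration:

// Minimal smoothing sketch: exponentially weighted moving average of RSSI.
// Lower alpha = smoother output but slower reaction to real changes.
public class RssiSmoother {
    private static final double ALPHA = 0.1;
    private double smoothedRssi = Double.NaN;

    public double addReading(double rssi) {
        if (Double.isNaN(smoothedRssi)) {
            smoothedRssi = rssi; // first sample seeds the filter
        } else {
            smoothedRssi = ALPHA * rssi + (1 - ALPHA) * smoothedRssi;
        }
        return smoothedRssi;
    }
}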

You're going to see lots of solutions simply instructing the user to "please hold your phone here" to reduce customer frustration.