
Detecting periodic data from the phone's accelerometer

I am developing an Android app and I need to detect the user's context (at minimum, whether they are walking or driving).

I am using the accelerometer and the sum of all axes to get the acceleration vector. It is working pretty well, in the sense that I can see periodic values while walking. But I need to detect these periods programmatically.

Is there any kind of math function to detect the period in a set of values? I have heard the Fourier transform can be used for that, but I really don't know how to implement it. It looks pretty complicated :)

Please help

asked Dec 06 '22 by simekadam


2 Answers

The simplest way to detect periodicity in data is autocorrelation, and it is also fairly simple to implement. To get the autocorrelation at lag i you multiply each data point with the data point shifted by i and sum up the products. Here is a small sketch of it in Java:

// data: e.g. the magnitude of the acceleration vector for each sample
double[] autocorrelate(double[] data) {
    int n = data.length;
    double[] autocorrel = new double[n];
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            autocorrel[i] += data[j] * data[(j + i) % n]; // circular shift by i
        }
    }
    return autocorrel;
}

This will give you an array of values. The strongest periodicity is at the index (lag) with the highest value. This way you can extract any periodic parts (there is usually more than one).
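For example, assuming the autocorrelate helper above and an accelerometer whose sample rate you know (samples and sampleRateHz are names I made up for this sketch, not part of any Android API), picking the dominant period could look roughly like this:

double[] acf = autocorrelate(samples);  // samples: your acceleration magnitudes
int bestLag = 1;                        // lag 0 is always the maximum, so skip it
for (int lag = 2; lag < acf.length / 2; lag++) {
    if (acf[lag] > acf[bestLag]) {
        bestLag = lag;
    }
}
double periodSeconds = bestLag / sampleRateHz;  // convert the lag to seconds

Whether the resulting period looks like a plausible step period is then up to your own thresholds.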

Also, I would suggest you do not try to implement your own FFT in an application. Although implementing the algorithm is very good for learning, there is a lot one can get wrong that is hard to test, and it is also likely that your implementation will be much slower than those which are already available. If it is possible on your system, I would suggest you use FFTW, which is impossible to beat in any respect when it comes to FFT implementations.

EDIT:

Explanation of why this works even on values which do not repeat exactly:

The usual and fully correct way to calculate the autocorrelation is to subtract the mean from your data. Let's say you have [1, 2, 1.2, 1.8]. Then you would subtract 1.5 from each sample, leaving you with [-.5, .5, -.3, .3]. Now if you multiply this with itself at an offset of zero, negatives will be multiplied by negatives and positives by positives, yielding (-.5)^2 + (.5)^2 + (-.3)^2 + (.3)^2 = .68. At an offset of one, negatives will be multiplied with positives, yielding (-.5)*(.5) + (.5)*(-.3) + (-.3)*(.3) + (.3)*(-.5) = -.64. At an offset of two, negatives will again be multiplied by negatives and positives by positives. At an offset of three, something similar to the offset-of-one situation happens again. As you can see, you get positive values at offsets of 0 and 2 (the periods) and negative values at 1 and 3.

Now, to only detect the period it is not necessary to subtract the mean. If you just leave the samples as they are, the squared mean gets added in each of the additions. Since the same value is added to each calculated coefficient, the comparison yields the same result as if you had first subtracted the mean. At worst, your datatype might overflow (in case you use some kind of integral type), or you might get round-off errors when the values get too big (in case you use floats; usually this is not a problem). If this happens, subtract the mean first and see whether your results improve.
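If you do want to subtract the mean first, a minimal sketch (again reusing the hypothetical autocorrelate helper and samples array from above) could be:

double mean = 0;
for (double v : samples) {
    mean += v;
}
mean /= samples.length;
double[] centered = new double[samples.length];
for (int i = 0; i < samples.length; i++) {
    centered[i] = samples[i] - mean;    // remove the DC offset
}
double[] acf = autocorrelate(centered); // same peak lags as before, smaller values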

The strongest drawback of autocorrelation vs. some kind of fast Fourier transform is speed. Autocorrelation takes O(n^2), whereas an FFT only takes O(n log(n)). So if you need to calculate the period of very long sequences very often, autocorrelation might not work in your case.

If you want to know how the Fourier transform works, and what all this stuff about real part, imaginary part, magnitude and phase means (have a look at the code posted by Manu, for example), I suggest you have a look at this book.

EDIT2:

In most cases data is neither fully periodic nor fully chaotic and aperiodic. Usually your data will be composed of several periodic components of varying strength. A period is a time difference by which you can shift your data to make it similar to itself. The autocorrelation calculates how similar the data is to itself when shifted by a certain amount; thus it gives you the strength of all possible periods. This means there is no "index of the repeating value", because when the data is perfectly periodic, all indexes will repeat. The index with the strongest value gives you the shift at which the data is most similar to itself, so this index is a time offset, not an index into your data. In order to understand this, it helps to think of a time series as being made up of a sum of perfectly periodic functions (sinusoidal basis functions).

If you need to detect this for very long time series, it is usually best to slide a window over your data and just estimate the period of this smaller data frame. Be aware, however, that the window itself adds additional periods to your data.
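A minimal sketch of such a sliding window (the window and hop sizes here are arbitrary assumptions, not recommendations):

int window = 256;  // frame length in samples (assumption)
int hop = 128;     // 50% overlap between frames (assumption)
for (int start = 0; start + window <= samples.length; start += hop) {
    double[] frame = java.util.Arrays.copyOfRange(samples, start, start + window);
    double[] acf = autocorrelate(frame);
    // ...pick the dominant lag per frame as shown further above...
}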

More in the link I posted in the last edit.

answered Dec 09 '22 by LiKao


There is also a way to compute the autocorrelation of your data using an FFT, which reduces the complexity from O(n^2) to O(n log n). The basic idea is: take your periodic sample data, transform it using an FFT, compute the power spectrum by multiplying each FFT coefficient by its complex conjugate, and then take the inverse FFT of the power spectrum. You can find pre-existing code to compute the power spectrum without much difficulty. For example, look at the Moonblink Android library. This library contains a Java translation of FFTPACK (a good FFT library) and it also has some DSP classes for computing power spectra.

An autocorrelation method I have used with success is the McLeod Pitch Method (MPM), the Java source code for which is available here. I have edited a method in the class McLeodPitchMethod which allows it to compute the pitch using the FFT-optimized autocorrelation algorithm:

private void normalizedSquareDifference(final double[] data) {
    int n = data.length;
    // zero-pad the data so we get a number of autocorrelation function (acf)
    // coefficients equal to the window size
    double[] fft = new double[2*n];
    for(int k=0; k < n; k++){
        fft[k] = data[k];
    }
    transformer.ft(fft);
    // the output of fft is 2n, symmetric complex
    // multiply first n outputs by their complex conjugates
    // to compute the power spectrum
    double[] acf = new double[n];
    acf[0] = fft[0]*fft[0]/(2*n);
    for(int k=1; k <= n-1; k++){
        acf[k] = (fft[2*k-1]*fft[2*k-1] + fft[2*k]*fft[2*k])/(2*n);
    }
    // inverse transform
    transformerEven.bt(acf);
    // the output of the ifft is symmetric real
    // first n coefficients are positive lag acf coefficients
    // now acf contains acf coefficients
    double[] divisorM = new double[n];
    for (int tau = 0; tau < n; tau++) {
        // subtract the first and last squared values from the previous divisor to get the new one;
        double m = tau == 0 ? 2*acf[0] : divisorM[tau-1] - data[n-tau]*data[n-tau] - data[tau-1]*data[tau-1];
        divisorM[tau] = m;
        nsdf[tau] = 2*acf[tau]/m;
    }
}

Where transformer is a private instance of the FFTTransformer class from the Java FFTPACK translation, and transformerEven is a private instance of the FFTTransformer_Even class. A call to McLeodPitchMethod.getPitch() with your data will give a very efficient estimate of the frequency.
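A rough usage sketch (the constructor arguments and the return type of getPitch() are assumptions on my part, so check the linked source for the actual signatures):

// Hypothetical usage; parameter names and types are assumptions, not taken from the linked source.
McLeodPitchMethod mpm = new McLeodPitchMethod(sampleRateHz, windowSize);
float dominantFrequencyHz = mpm.getPitch(accelerometerWindow); // accelerometerWindow: float[] of magnitudes
double periodSeconds = 1.0 / dominantFrequencyHz;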

answered Dec 09 '22 by willpett