Computing Autocorrelation with FFT Using JTransforms Library

I'm trying to calculate the autocorrelation of sample windows in a time series using the code below. I apply an FFT to the window, then compute the squared magnitudes from the real and imaginary parts (setting the imaginary parts to zero), and finally take the inverse transform to obtain the autocorrelation:

DoubleFFT_1D fft = new DoubleFFT_1D(magCnt);
fft.realForward(magFFT);

magFFT[0] = (magFFT[0] * magFFT[0]);
for (int i = 1; i < (magCnt - (magCnt%2)) / 2; i++) {
    magFFT[2*i] = magFFT[2*i] * magFFT[2*i] + magFFT[2*i + 1] * magFFT[2*i + 1];
    magFFT[2*i + 1] = 0.0;
}

if (magCnt % 2 == 0) {
    magFFT[1] = (magFFT[1] * magFFT[1]);
} else {
    magFFT[magCnt/2] = (magFFT[magCnt-1] * magFFT[magCnt-1] + magFFT[1] * magFFT[1]);
}

autocorr = new double[magCnt];
System.arraycopy(magFFT, 0, autocorr, 0, magCnt);
DoubleFFT_1D ifft = new DoubleFFT_1D(magCnt);
ifft.realInverse(autocorr, false);

for (int i = 1; i < autocorr.length; i++)
    autocorr[i] /= autocorr[0];
autocorr[0] = 1.0;

My first question: this code appears to map the autocorrelation result to the [0,1] range, although correlation is supposed to lie between -1 and 1. Of course it is easy to map the results to the [-1,1] range, but I'm not sure whether that mapping is correct. How should the values in the resulting autocorr array be interpreted?

Secondly, with this code I'm getting good results for some periodic series, that is, I get higher values at the autocorrelation indices that correspond to the period of the signal. However, the results go weird when I apply it to non-periodic signals: all the values in the autocorr array come out very close to 1. What is the reason for that?

asked Sep 02 '12 by mostar

1 Answer

For FFT-based algorithms to work, you must pay careful attention to the definitions, including the conventions of the library you are using. You seem to be mixing up the "signal processing" convention for autocorrelation and the "statistical" one. And then there is FFT wrapping and zero padding to account for.

Here is code that works for the even-N case, using the signal processing convention. It's tested against a brute-force wrapped autocorrelation. The comments show how to convert it to the statistical convention: for statistical autocorrelation, the mean of the data is subtracted out, which can be done simply by zeroing the "0 Hz" component of the FFT. The zeroth element of the autocorrelation is then the variance, and you can normalize by dividing through by that quantity. The resulting values will fall in [-1, 1], as you say.
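
Spelled out (in my own notation, not taken from the post), the wrapped signal-processing autocorrelation and the statistical autocorrelation of a window x of length n are roughly

$$r(k) = \sum_{i=0}^{n-1} x_i\, x_{(i+k) \bmod n},
\qquad
\rho(k) = \frac{\sum_{i=0}^{n-1} (x_i - \bar{x})\,(x_{(i+k) \bmod n} - \bar{x})}{\sum_{i=0}^{n-1} (x_i - \bar{x})^2}.$$

Zeroing the 0 Hz FFT bin is exactly the $x_i - \bar{x}$ subtraction done in the frequency domain, and the denominator of $\rho$ is just its numerator at $k = 0$ (the variance, up to a factor of n), which is why dividing by the zero-lag value lands the results in [-1, 1].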

Your code does the dividing through, but it does not remove the 0 Hz component of the data, so it's computing some kind of mashup of the two conventions. That is also the likely reason your non-periodic signals come out with every value close to 1: with the mean left in, the DC term dominates every lag, and dividing by the zero-lag value pushes everything toward 1.

import edu.emory.mathcs.jtransforms.fft.DoubleFFT_1D;
import java.util.Arrays;

public class TestFFT {

    void print(String msg, double [] x) {
        System.out.println(msg);
        for (double d : x) System.out.println(d);
    }

    /**
     * This is a "wrapped" signal processing-style autocorrelation. 
     * For "true" autocorrelation, the data must be zero padded.  
     */
    public void bruteForceAutoCorrelation(double [] x, double [] ac) {
        Arrays.fill(ac, 0);
        int n = x.length;
        for (int j = 0; j < n; j++) {
            for (int i = 0; i < n; i++) {
                ac[j] += x[i] * x[(n + i - j) % n];
            }
        }
    }

    private double sqr(double x) {
        return x * x;
    }

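    /**
     * Wrapped autocorrelation computed via the FFT (Wiener-Khinchin):
     * inverse-transform the squared magnitudes of the forward transform.
     * Note: this transforms x in place and assumes x.length is even.
     */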
    public void fftAutoCorrelation(double [] x, double [] ac) {
        int n = x.length;
        // Assumes n is even.
        DoubleFFT_1D fft = new DoubleFFT_1D(n);
        fft.realForward(x);
        ac[0] = sqr(x[0]);
        // ac[0] = 0;  // For statistical convention, zero out the mean 
        ac[1] = sqr(x[1]);
        for (int i = 2; i < n; i += 2) {
            ac[i] = sqr(x[i]) + sqr(x[i+1]);
            ac[i+1] = 0;
        }
        DoubleFFT_1D ifft = new DoubleFFT_1D(n); 
        ifft.realInverse(ac, true);
        // For statistical convention, normalize by dividing through with variance
        //for (int i = 1; i < n; i++)
        //    ac[i] /= ac[0];
        //ac[0] = 1;
    }

    void test() {
        double [] data = { 1, -81, 2, -15, 8, 2, -9, 0};
        double [] ac1 = new double [data.length];
        double [] ac2 = new double [data.length];
        bruteForceAutoCorrelation(data, ac1);
        fftAutoCorrelation(data, ac2);
        print("bf", ac1);
        print("fft", ac2);
        double err = 0;
        for (int i = 0; i < ac1.length; i++)
            err += sqr(ac1[i] - ac2[i]);
        System.out.println("err = " + err);
    }

    public static void main(String[] args) {
        new TestFFT().test();
    }
}
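
The code above computes the wrapped (circular) autocorrelation. As the comment in bruteForceAutoCorrelation says, the "true" (non-circular) autocorrelation needs zero padding. A minimal sketch of that variant, assuming padding to twice the signal length and the same JTransforms packed real format as above (the class and method names here are mine, not part of the original answer):

import edu.emory.mathcs.jtransforms.fft.DoubleFFT_1D;

public class PaddedAutoCorrelation {

    /**
     * "True" (linear, non-circular) autocorrelation for lags 0..n-1,
     * obtained by zero padding the signal to length 2n before the FFT.
     */
    public static double[] paddedFftAutoCorrelation(double[] x) {
        int n = x.length;
        int m = 2 * n;                     // pad to 2n so no lag wraps around
        double[] buf = new double[m];      // zero-padded copy of the signal
        System.arraycopy(x, 0, buf, 0, n);

        DoubleFFT_1D fft = new DoubleFFT_1D(m);
        fft.realForward(buf);              // packed real format, m is even

        double[] ac = new double[m];
        ac[0] = buf[0] * buf[0];           // DC bin
        ac[1] = buf[1] * buf[1];           // Nyquist bin (stored at index 1)
        for (int i = 2; i < m; i += 2) {
            ac[i] = buf[i] * buf[i] + buf[i + 1] * buf[i + 1];  // |X_k|^2
            ac[i + 1] = 0.0;
        }
        fft.realInverse(ac, true);         // scaled inverse transform

        double[] result = new double[n];
        System.arraycopy(ac, 0, result, 0, n);  // keep lags 0..n-1
        return result;
    }
}

For a statistical-style result with padding, subtract the mean from x before copying it into buf, then divide the lags by the zero-lag value; zeroing the DC bin is not equivalent here, because that would also shift the padding zeros.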
answered Oct 16 '22 by Gene