I've written some Python code to take a 2D signal and FFT it, and now I want to extract the frequencies associated with the FFT. The np.fft.fftfreq
call fails, giving me the error
File "/usr/lib64/python2.7/site-packages/numpy/fft/helper.py", line 153, in fftfreq
assert isinstance(n,types.IntType) or isinstance(n, integer)
AssertionError
My code is:
import numpy as np
import scipy as sp
import pylab
import sys
import math
filename = sys.argv[1] # Get name of file to open
ifp = open(filename, "r")
ifp.seek(0)
nrows = 0
ncols = 0
nrows = sum(1 for line in ifp) # Sum over all the lines in the file ptr
ifp.seek(0) # Set the fptr back to beginning of file
for line in ifp:
    ncols = len(line.split()) # Split and count number of words in a line
    if ncols > 0:
        break
OrigData = np.zeros([nrows, ncols], dtype=np.float32) #Allocate numpy array
FFTData = np.zeros([nrows, ncols], dtype=complex)
IFFTData = np.zeros([nrows, ncols], dtype=complex)
FreqComp = np.zeros([nrows, ncols], dtype=np.float32)
ii = 0
jj = 0
ifp.seek(0)
for line in ifp:
    linedata = line.split()
    jj = 0
    for el in linedata:
        OrigData[ii, jj] = float(el)
        jj += 1
    ii += 1
ifp.close()
FFTData = np.fft.fft2(OrigData)
FreqComp = np.fft.fftfreq(FFTData, d=2)
#--- Continue with more code ---#
I know that everything else works except the np.fft.fftfreq
line, because I added it last. How does one extract two-dimensional frequency components?
You are passing in an invalid parameter: np.fft.fftfreq
takes the size of the signal (an integer) as its first parameter and the sample spacing as its second. You are passing an array as the first parameter.
You do need to perform an np.fft.fft
on the signal first, though.
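To illustrate the signature, here is a minimal sketch of a correct call; the sample count of 8 is hypothetical, and d=2 matches the spacing used in the question:

```python
import numpy as np

n = 8  # number of samples (hypothetical)
d = 2  # sample spacing, as in the question

# First argument must be an integer size, not an array
freqs = np.fft.fftfreq(n, d=d)
print(freqs)  # frequencies in cycles per unit of the sample spacing
```

Passing the array itself (as in the question) trips the integer assertion; passing its length, or the relevant entry of its shape, does not.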
Hate to point out the obvious, but read the documentation for np.fft.fftfreq
... the example code is pretty clear.
Having performed a 2D FFT, you can obtain the sample frequencies along each dimension as follows:
FreqCompRows = np.fft.fftfreq(FFTData.shape[0], d=2)
FreqCompCols = np.fft.fftfreq(FFTData.shape[1], d=2)
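If you need a full 2D grid of frequency coordinates (one row/column frequency pair per FFT bin), the two 1D arrays can be combined with np.meshgrid. A sketch, using a hypothetical 4x6 array in place of OrigData and the same d=2 spacing:

```python
import numpy as np

data = np.random.rand(4, 6)  # stand-in for OrigData (hypothetical shape)
FFTData = np.fft.fft2(data)

# Sample frequencies along each dimension
FreqCompRows = np.fft.fftfreq(FFTData.shape[0], d=2)
FreqCompCols = np.fft.fftfreq(FFTData.shape[1], d=2)

# One (row_freq, col_freq) pair per element of FFTData;
# indexing='ij' keeps the row axis first, matching FFTData's layout
fy, fx = np.meshgrid(FreqCompRows, FreqCompCols, indexing='ij')
print(fy.shape, fx.shape)
```

With indexing='ij', both fy and fx have the same shape as FFTData, so fy[i, j] and fx[i, j] give the frequency coordinates of bin FFTData[i, j].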