Fastest approach to read a big ASCII file into a numpy array

I have a 1505 MB text file containing float data. The file has about 73000 rows and 1500 columns. I would like to read the content of the file into a numpy array and then perform some analysis on it, but my machine has been getting slow using numpy.loadtxt to read the file. What is the fastest way to read this file into an array using Python?

asked Nov 17 '25 by Dalek


2 Answers

You can also use the pandas reader, whose parser is heavily optimized:

In [1]: from numpy import savetxt, loadtxt
In [2]: from numpy.random import rand; import pandas as pd

In [3]: savetxt('data.txt', rand(10000, 100))

In [4]: %time u = loadtxt('data.txt')
Wall time: 7.21 s

In [5]: %time u = read_large_txt('data.txt', ' ')   # defined in the answer below
Wall time: 3.45 s

In [6]: %time u = pd.read_csv('data.txt', ' ', header=None).values
Wall time: 1.41 s
answered Nov 20 '25 by B. M.


The following function preallocates exactly the amount of memory needed to read the text file into an array.

import numpy as np

def read_large_txt(path, delimiter=None, dtype=None):
    with open(path) as f:
        nrows = sum(1 for line in f)            # first pass: count the rows
        f.seek(0)
        ncols = len(next(f).split(delimiter))   # number of columns from the first line
        out = np.empty((nrows, ncols), dtype=dtype)
        f.seek(0)
        for i, line in enumerate(f):            # second pass: fill the preallocated array
            out[i] = line.split(delimiter)
    return out

It allocates the memory by determining beforehand the number of rows, the number of columns, and the data type. You could easily add some of the extra arguments found in np.loadtxt or np.genfromtxt, such as skiprows, usecols and so forth, as in the sketch below.
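For illustration only, here is one way such keywords might be bolted on. read_large_txt_ext, its skiprows/usecols handling and the column-selection list are my own sketch modeled on np.loadtxt's keyword names, not part of the original answer:

import numpy as np

def read_large_txt_ext(path, delimiter=None, dtype=None, skiprows=0, usecols=None):
    # Hypothetical extension of read_large_txt with loadtxt-style skiprows/usecols.
    with open(path) as f:
        nrows = sum(1 for _ in f) - skiprows      # data rows left after skipping headers
        f.seek(0)
        for _ in range(skiprows):                 # discard header lines
            next(f)
        first_fields = next(f).split(delimiter)
        cols = list(usecols) if usecols is not None else list(range(len(first_fields)))
        out = np.empty((nrows, len(cols)), dtype=dtype)
        f.seek(0)
        for _ in range(skiprows):
            next(f)
        for i, line in enumerate(f):              # fill the preallocated array row by row
            fields = line.split(delimiter)
            out[i] = [fields[j] for j in cols]
    return out

# e.g. read_large_txt_ext('data.txt', skiprows=1, usecols=(0, 2))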

Important:

As @Evert observed, out[i] = line.split(delimiter) looks wrong because it assigns a list of strings to a row of a numeric array, but NumPy converts the strings to the array's dtype on assignment, so no extra handling is needed here. There are some limits, though: the strings must actually be parseable as that dtype.
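A minimal demonstration of that conversion (my own snippet, not from the answer):

import numpy as np

out = np.empty((1, 3))
out[0] = '1.5 2.0 3e-4'.split()   # a list of numeric strings is cast to float64 on assignment
print(out)                        # the row now holds 1.5, 2.0 and 0.0003 as floats
# out[0] = 'a b c'.split()        # would raise ValueError: the strings must parse as floats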

answered Nov 20 '25 by Saullo G. P. Castro


