
Python MemoryError: cannot allocate array memory

I've got a 250 MB CSV file I need to read, with ~7000 rows and ~9000 columns. Each row represents an image, and each column is a pixel (a greyscale value from 0 to 255).

I started with a simple np.loadtxt("data/training_nohead.csv", delimiter=","), but this gave me a MemoryError. I thought this was strange, since I'm running 64-bit Python with 8 gigs of memory installed and it died after using only around 512 MB.

I've since tried SEVERAL other tactics, including:

  1. import fileinput and read one line at a time, appending them to an array (see the sketch after this list)
  2. np.fromstring after reading in the entire file
  3. np.genfromtxt
  4. Manual parsing of the file (since all data is integers, this was fairly easy to code; also sketched below)
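
For reference, here's roughly what tactics 1 and 4 looked like (reconstructed for illustration; not my exact code):

import fileinput
import numpy as np

# Tactic 1: read one line at a time and append to a list, then build the array
rows = []
for line in fileinput.input("data/training_nohead.csv"):
    rows.append([int(x) for x in line.rstrip().split(",")])
train = np.array(rows)

# Tactic 4: manual parsing into a flat list of integers
values = []
with open("data/training_nohead.csv") as f:
    for line in f:
        values.extend(int(x) for x in line.rstrip().split(","))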

Every method gave me the same result: a MemoryError at around 512 MB. Wondering if there was something special about 512 MB, I created a simple test program that filled up memory until Python crashed:

str = " " * 511000000 # Start at 511 MB
while 1:
    str = str + " " * 1000 # Add 1 KB at a time

Doing this didn't crash until around 1 gig. Just for fun, I also tried s = " " * 2048000000 (filling 2 gigs); this ran without a hitch, filled the RAM, and never complained. So the issue isn't the total amount of RAM I can allocate, but seems to be how many TIMES I can allocate memory...
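
For what it's worth, a quick sanity check (not part of my original tests, just for illustration) confirms whether the interpreter really is a 64-bit build:

import sys
import platform

print(platform.architecture())  # e.g. ('64bit', 'ELF') on a 64-bit build
print(sys.maxsize > 2**32)      # True on 64-bit Python, False on 32-bit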

I googled around fruitlessly until I found this post: Python out of memory on large CSV file (numpy)

I copied the code from the answer exactly:

def iter_loadtxt(filename, delimiter=',', skiprows=0, dtype=float):
    def iter_func():
        with open(filename, 'r') as infile:
            for _ in range(skiprows):
                next(infile)
            for line in infile:
                line = line.rstrip().split(delimiter)
                for item in line:
                    yield dtype(item)
        iter_loadtxt.rowlength = len(line)

    data = np.fromiter(iter_func(), dtype=dtype)
    data = data.reshape((-1, iter_loadtxt.rowlength))
    return data

Calling iter_loadtxt("data/training_nohead.csv") gave a slightly different error this time:

MemoryError: cannot allocate array memory

Googling this error, I found only one post, and it wasn't very helpful: Memory error (MemoryError) when creating a boolean NumPy array (Python)

As I'm running Python 2.7, this was not my issue. Any help would be appreciated.

asked Dec 06 '13 by stevendesu


1 Answer

With some help from @J.F. Sebastian I developed the following answer:

import numpy as np

train = np.empty([7049, 9246])
row = 0
for line in open("data/training_nohead.csv"):
    train[row] = np.fromstring(line, sep=",")
    row += 1
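
Since the pixel values are integers from 0 to 255, you can also shrink the array itself: with dtype=np.uint8 the preallocated array takes roughly 65 MB instead of the ~520 MB a float64 array needs. A small variant of the same loop (the enumerate is purely a stylistic choice):

import numpy as np

# Same idea, but storing pixels as uint8 (values are 0-255),
# so the preallocated array is about 8x smaller than float64
train = np.empty([7049, 9246], dtype=np.uint8)
with open("data/training_nohead.csv") as f:
    for row, line in enumerate(f):
        train[row] = np.fromstring(line, sep=",")  # parsed as float, cast on assignment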

Of course, this answer assumes prior knowledge of the number of rows and columns. Should you not have this information beforehand, counting the rows will always take a while, as you have to read the entire file and count the \n characters. Something like this will suffice:

num_rows = 0
for line in open("data/training_nohead.csv"):
    num_rows += 1

For the number of columns: if every row has the same number of columns, then you can just count the first row; otherwise you need to keep track of the maximum.

num_rows = 0
max_cols = 0
for line in open("data/training_nohead.csv"):
    num_rows += 1
    tmp = line.split(",")
    if len(tmp) > max_cols:
        max_cols = len(tmp)

This solution works best for numerical data, as a string containing a comma could really complicate things.
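
Putting the pieces together, a two-pass loader (count first, then preallocate and fill) might look like this. It's a sketch along the lines of the snippets above, not something I've run against the original data:

import numpy as np

def load_csv_two_pass(filename, sep=","):
    # First pass: count rows and the maximum number of columns
    num_rows = 0
    max_cols = 0
    with open(filename) as f:
        for line in f:
            num_rows += 1
            max_cols = max(max_cols, len(line.split(sep)))

    # Second pass: preallocate once and fill row by row
    data = np.zeros((num_rows, max_cols))
    with open(filename) as f:
        for row, line in enumerate(f):
            values = np.fromstring(line, sep=sep)
            data[row, :len(values)] = values
    return data

train = load_csv_two_pass("data/training_nohead.csv")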

answered by stevendesu