
Memory management of large lists in Python

Today I wrote a program using a list with 64000000 entries. Creating it with sigma=[1]*64000000 in Python runs fine, but later on, as the program computes, my Ubuntu machine freezes, with no reaction to input, not even mouse movement. I tried twice and the result was the same.

When implemented in C++, long long sigma[64000000] holds up fine and runs very fast.

Is there any reason my program would freeze in the middle of running, rather than crashing at the beginning?

EDIT: To reply to Chris below, my code did not freeze until a couple of loops later.

Thank you all!

For those interested in seeing the code, this is the program, a brute-force solution to Project Euler 211:

from math import sqrt

def e211():
    ans = 0
    sigma = [1] * 64000000
    # Sieve-style accumulation of sigma_2 (sum of squared divisors)
    for i in range(2, 64000000):
        j = i
        if (j % 1000 == 0) or (j < 100):
            print(j)  # progress output
        q = i * i
        while j < 64000000:
            sigma[j] += q
            j += i
    # Sum the indices whose sigma_2 value is a perfect square
    for i in range(1, 64000000):
        j = int(sqrt(sigma[i]))
        if j * j == sigma[i]:
            ans += i
    return ans

if __name__ == '__main__':
    print(e211())
asked Feb 22 '26 17:02 by zw324


2 Answers

Python lists are lists of objects. A number in Python is an object in itself, and takes up somewhat more storage than the 64 bits needed to represent a long long in C++. In particular, Python transparently handles numbers larger than 32 bits, which end up taking a lot more space than a simple integer.
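As a quick sanity check, you can measure this overhead yourself with sys.getsizeof (a sketch; exact sizes vary by CPython version and platform):

```python
import sys

# Each Python int is a full object with a header, not a raw 64-bit word.
print(sys.getsizeof(1))        # typically 28 bytes on 64-bit CPython
print(sys.getsizeof(2**64))    # grows further once the value exceeds a machine word

# A list holds pointers to those objects, so [1] * 64000000 costs the
# pointer array (~8 bytes per slot) on top of the int objects themselves.
lst = [1] * 1000
print(sys.getsizeof(lst))      # size of the pointer array only, not the ints
```

So a list of 64 million ints costs far more than the 512 MB a C++ long long array of the same length would need.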

You may find the standard Python array module useful. It provides Python-compatible access to uniform arrays of integers of a specified size. (However, I note that it doesn't offer a 64-bit integer type.)
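A minimal sketch of the idea, using typecode 'l' (a signed platform long, whose width depends on the platform):

```python
from array import array

# A uniform array stores raw machine integers, one per slot,
# instead of one full Python object per element.
sigma = array('l', [1]) * 1000          # 1000 elements, all set to 1
sigma[5] += 25                          # supports normal indexing/assignment
print(sigma.itemsize)                   # bytes per element (platform-dependent)
print(len(sigma) * sigma.itemsize)      # total payload size in bytes
```

The memory cost is itemsize bytes per element with no per-element object overhead, at the price of being restricted to a fixed integer range.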

answered Feb 25 '26 05:02 by Greg Hewgill


range(1,64000000):

In Python 2, this line creates a full list of size 64000000, so now you have two lists of that size in memory. Use xrange instead. (In Python 3, range is already lazy, so this doesn't apply.)
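The difference is easy to see: a lazy range (xrange in Python 2, range in Python 3) takes constant memory no matter how large the bound is. A sketch for Python 3:

```python
import sys

# range in Python 3 is a lazy sequence: it computes indices on demand,
# so its own size does not grow with the bound.
r = range(1, 64000000)
print(sys.getsizeof(r))   # small and constant regardless of the bound
print(r[0], len(r))       # still supports indexing and len()
```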

answered Feb 25 '26 05:02 by Chris Eberle