Very large matrices using Python and NumPy

NumPy is an extremely useful library, and from using it I've found that it can easily handle quite large matrices (10000 x 10000), but it begins to struggle with anything much larger (trying to create a 50000 x 50000 matrix fails). Obviously, this is because of the massive memory requirements.

Is there a way to create huge matrices natively in NumPy (say 1 million by 1 million) without having several terabytes of RAM?

Peter asked Jun 28 '09



1 Answer

PyTables and NumPy are the way to go.

PyTables will store the data on disk in HDF format, with optional compression. My datasets often get 10x compression, which is handy when dealing with tens or hundreds of millions of rows. It's also very fast; my 5-year-old laptop can crunch through data doing SQL-like GROUP BY aggregation at 1,000,000 rows/second. Not bad for a Python-based solution!
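
As an illustration, here is a minimal sketch of writing a large matrix to disk in row blocks with compression. The file name data.h5, the node name x, and the zlib settings are illustrative choices of mine, not anything prescribed by the answer:

import numpy as np
import tables

# Disk-backed, compressed, extendable array (EArray). Only one row
# block is ever held in RAM while writing.
filters = tables.Filters(complevel=5, complib='zlib')  # optional compression
with tables.open_file('data.h5', mode='w') as f:
    x = f.create_earray(f.root, 'x',
                        atom=tables.Float64Atom(),
                        shape=(0, 10000),         # 0 rows to start, extendable along rows
                        filters=filters,
                        expectedrows=1000000)     # hint helps PyTables pick a chunk size
    for _ in range(100):                          # 100 blocks of 1000 rows each
        x.append(np.random.rand(1000, 10000))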

Accessing the data again as a NumPy recarray is as simple as:

data = table[row_from:row_to]  # only rows row_from..row_to are read from disk

The HDF library takes care of reading in the relevant chunks of data and converting them to NumPy arrays.
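
A matching sketch of reading a slice back (using the same hypothetical file and node names as above); only the requested rows are read and decompressed:

import tables

with tables.open_file('data.h5', mode='r') as f:
    block = f.root.x[5000:6000]    # 1,000 rows come off disk as a NumPy array
    means = block.mean(axis=0)     # plain NumPy from here on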

Stephen Simmons answered Sep 19 '22