I have a Python program that is going to eat a lot of memory, primarily in a dict. This dict is responsible for assigning a unique integer value to a very large set of keys. Since I am working with large matrices, I need a key-to-index correspondence that can also be reversed (i.e., once the matrix computations are complete, I need to map the indices back to the original keys).
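For illustration, a simplified sketch of the kind of two-way mapping I mean (not my actual code):

```python
# Sketch of the two-way mapping: a dict for key -> index, a list for index -> key.
key_to_index = {}   # key -> unique integer index
index_to_key = []   # index -> key, for recovering keys after the matrix work

def get_index(key):
    """Assign the next free integer to an unseen key; return its index."""
    if key not in key_to_index:
        key_to_index[key] = len(index_to_key)
        index_to_key.append(key)
    return key_to_index[key]

# After the matrix computations, index_to_key[i] recovers the original key.
```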
I believe this dict will eventually exceed available memory. I am wondering how that will be handled with regard to swap space. Perhaps there is a better data structure for this purpose.
If the data will exceed memory, you need a database. Dictionary indexing isn't designed to perform well once a dict is bigger than memory.
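For example, the key-to-index table could live on disk in SQLite via the standard-library sqlite3 module. A rough sketch (the file, table, and column names here are made up for illustration):

```python
import sqlite3

# Hypothetical sketch: keep the key <-> index mapping in an on-disk SQLite table
# so it is not limited by RAM.
conn = sqlite3.connect("key_index.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS key_index ("
    "  idx INTEGER PRIMARY KEY AUTOINCREMENT,"   # auto-assigned unique integer
    "  key TEXT UNIQUE NOT NULL"                 # the original key
    ")"
)

def get_index(key):
    """Return the integer index for key, assigning a new one if unseen."""
    conn.execute("INSERT OR IGNORE INTO key_index (key) VALUES (?)", (key,))
    conn.commit()
    (idx,) = conn.execute(
        "SELECT idx FROM key_index WHERE key = ?", (key,)
    ).fetchone()
    return idx

def get_key(idx):
    """Reverse lookup: recover the original key from its index."""
    row = conn.execute(
        "SELECT key FROM key_index WHERE idx = ?", (idx,)
    ).fetchone()
    return row[0] if row else None
```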
Swap space is a kernel feature and transparent to the user (and to Python).
If you do have a huge dict and don't need all the data at once, you could look at redis, which might do what you want. Or maybe not :)
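If redis does fit, the two-way mapping could look something like this with redis-py (a sketch only, assuming a local redis server and made-up hash names; the check-then-set here is not safe under concurrent writers):

```python
import redis  # pip install redis; assumes a redis server on localhost

r = redis.Redis()

def get_index(key):
    """Return the integer index for key, assigning the next one if unseen."""
    idx = r.hget("key->idx", key)
    if idx is None:
        idx = r.incr("next_idx")          # atomically hand out the next integer
        r.hset("key->idx", key, idx)      # forward mapping: key -> index
        r.hset("idx->key", idx, key)      # reverse mapping: index -> key
    return int(idx)

def get_key(idx):
    """Recover the original key from its index."""
    key = r.hget("idx->key", idx)
    return key.decode() if key is not None else None
```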