I would like to combine the functionality of numpy's array with native Python's dict, namely by creating a multidimensional array that can be indexed with strings.
For example, I could do this:
dict_2d = {'a': {'x': 1, 'y': 2},
           'b': {'x': 3, 'y': 4}}
print(dict_2d['a', 'y'])  # returns 2
I know I could do dict_2d['a']['x'], but long term I'd like to be able to treat them like numpy arrays, including doing matrix multiplication and such, and that's not possible with layered dicts.
It's also not that hard to write a simple class that converts all the strings to int indexes and then uses numpy underneath, but I'd like to use something that already exists if possible.
Edit: I don't need incredible performance; I'll be working with maybe 10x10 arrays. My goal is to make the code simple and robust to write. Working with raw numpy indices is not really much different from just writing it in Fortran, and I've spent enough of my life tracking down Fortran indexing errors...
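For reference, the "simple class" approach mentioned above can be sketched roughly like this (the class name LabeledArray and its constructor arguments are hypothetical, just to illustrate mapping string labels to integer indices over a numpy array):

```python
import numpy as np

class LabeledArray:
    """Thin wrapper that maps string labels to integer indices
    and delegates storage to a numpy array."""

    def __init__(self, data, row_labels, col_labels):
        self.data = np.asarray(data, dtype=float)
        self.rows = {label: i for i, label in enumerate(row_labels)}
        self.cols = {label: j for j, label in enumerate(col_labels)}

    def __getitem__(self, key):
        row, col = key  # expects a ('row_label', 'col_label') tuple
        return self.data[self.rows[row], self.cols[col]]

    def __setitem__(self, key, value):
        row, col = key
        self.data[self.rows[row], self.cols[col]] = value

arr = LabeledArray([[1, 2], [3, 4]], row_labels=['a', 'b'], col_labels=['x', 'y'])
print(arr['a', 'y'])  # 2.0
```

This gets string indexing working, but matrix operations would still have to be added by hand, which is exactly why an existing library is preferable.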
Use pandas. Let's say the file looks like this:
test.csv:
Params, Val1, Val2, Val3
Par1,23,58,412
Par2,56,45,123
Par3,47,89,984
So you can do something like this in python:
import pandas as pd
x = pd.read_csv('test.csv', index_col='Params')
print(x['Val1']['Par3'])  # prints 47
You may be looking for pandas, which provides handy datatypes that wrap numpy arrays, allowing you to access rows and columns by name instead of just by number.
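To connect this back to the original example: a pandas DataFrame gives you the dict-style string indexing via .loc, and label-aware matrix multiplication via .dot (or the @ operator). A minimal sketch, with made-up labels matching the question:

```python
import pandas as pd

# Build a 2x2 labeled array, equivalent to the nested-dict example
df = pd.DataFrame([[1, 2], [3, 4]], index=['a', 'b'], columns=['x', 'y'])
print(df.loc['a', 'y'])  # 2

# Matrix multiplication aligns on labels: columns of df must match
# the index of the other operand
other = pd.DataFrame([[1, 0], [0, 1]], index=['x', 'y'], columns=['p', 'q'])
product = df.dot(other)  # result is labeled: index ['a', 'b'], columns ['p', 'q']
```

If you ever need the plain numpy array back (e.g. for a routine that doesn't accept DataFrames), df.values returns the underlying data.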