I am looking for a way to dump a nested dictionary containing numpy arrays into a JSON file (to keep full logs of my experiments and data in one place).
My dictionary looks like this (the actual structure may be more deeply nested than shown):
import numpy as np
data = {'foo': {'bar': np.array([1, 2, 3])}, 'goo': np.array([3, 5, 7]), 'fur': {'dur': {'mur': np.array([7, 5, 8])}}}
At the moment this code fails because a numpy array is not JSON serializable:
import json

with open('data.txt', 'w') as fl:
    json.dump(data, fl)
I know it is possible to use tolist(), but I do not know how to walk over the dictionary while preserving its structure and replacing the np.arrays with lists.
I tried getting the individual values from the dictionary using recursion, but I do not know how to "build the dictionary back". My code at the moment (without the json dump):
import numpy as np

def dict_walk(data):
    # Recurse into nested dicts; convert array leaves to lists and print them.
    for k, v in data.items():
        if isinstance(v, dict):
            dict_walk(v)
        else:
            l = v.tolist()
            print(l)

data = {'foo': {'bar': np.array([1, 2, 3])}, 'goo': np.array([3, 5, 7]), 'fur': {'dur': {'mur': np.array([7, 5, 8])}}}
dict_walk(data)
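For illustration, a recursive walk that returns a new dictionary instead of printing could look like this (a sketch, assuming every leaf value is either a dict or an np.ndarray; the name dict_to_lists is hypothetical):

def dict_to_lists(data):
    # Return a copy of the nested dict with np.ndarray leaves
    # replaced by plain Python lists; other values pass through unchanged.
    if isinstance(data, dict):
        return {k: dict_to_lists(v) for k, v in data.items()}
    if isinstance(data, np.ndarray):
        return data.tolist()
    return data

# The resulting copy can be passed straight to json.dump.
serializable = dict_to_lists(data)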
You can give json.dump a default function; it is called for any data type that JSON doesn't know how to handle:
def default(obj):
    # Called by json.dump for objects it cannot serialize itself.
    if isinstance(obj, np.ndarray):
        return obj.tolist()
    raise TypeError('Not serializable')
with open('data.txt', 'w') as fl:
    json.dump(data, fl, default=default)
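If you later need the arrays back when reloading the file, a recursive walk over the loaded data can convert the lists again (a sketch, assuming every list in the file should become an array; the name lists_to_arrays is hypothetical):

import json
import numpy as np

def lists_to_arrays(data):
    # Reverse step: turn every list back into an np.ndarray after json.load.
    if isinstance(data, dict):
        return {k: lists_to_arrays(v) for k, v in data.items()}
    if isinstance(data, list):
        return np.array(data)
    return data

with open('data.txt') as fl:
    restored = lists_to_arrays(json.load(fl))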