Given a raw binary representation of a numpy
array, what is the complete set of metadata needed to unambiguously restore the array?
For example,
>>> np.fromstring(np.array([42]).tostring())
array([ 2.07507571e-322])
which is to be expected (with hindsight, at least): here I haven't told fromstring
to expect ints, so it goes with the default float.
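Specifying the dtype does repair this particular example (assuming a platform whose default integer is 64-bit, so the buffer is 8 bytes):
>>> np.fromstring(np.array([42]).tostring(), dtype=np.int64)
array([42])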
But it seems to me that just specifying dtype=np.float64
or similar may not be sufficient. For example,
>>> a = np.array([42.])
>>> a.dtype
dtype('float64')
>>> a.dtype.byteorder
'='
which the docs tell me means 'native order'. Meaning, it's going to be interpreted differently on big-endian and little-endian machines, or am I missing something simple?
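To make the worry concrete, here is a small sketch (tobytes/frombuffer are the newer spellings of tostring/fromstring, and the comments assume a little-endian machine):
>>> raw = np.array([42.]).tobytes()   # native byte order, no metadata attached
>>> np.frombuffer(raw, dtype='<f8')   # little-endian read: array([ 42.])
>>> np.frombuffer(raw, dtype='>f8')   # same bytes read big-endian: a tiny denormal
>>> np.array([42.]).dtype.str         # unlike .byteorder, this is explicit: '<f8'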
sys.byteorder
gives the endianness of the machine.
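For example, you can resolve the native '=' into an explicit byte-order marker yourself (the 'f8' type code is just for illustration):

import sys
import numpy as np

order = '<' if sys.byteorder == 'little' else '>'   # what '=' resolves to on this machine
print(np.dtype(order + 'f8').str)                   # e.g. '<f8', unambiguous everywhere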
However, as @J.F.Sebastian, @seberg and @jorgeca have suggested, np.savez
is a better way to go. The help docstring shows
import io
import numpy as np

x, y = np.arange(3), np.arange(3.0)  # example arrays to save
content = io.BytesIO()
np.savez(content, x=x, y=y)
content.seek(0)  # rewind before reading the buffer back
which means you could save the buffer's bytes (content.getvalue()) as a BLOB
in an sqlite database.
Then, when you SELECT this blob from the database, wrap it in a fresh BytesIO and it can be re-converted into numpy arrays with
data = np.load(content)
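Putting it together, a minimal round-trip sketch (the in-memory database and the arrays table are just for illustration):

import io
import sqlite3
import numpy as np

x, y = np.arange(3), np.arange(3.0)

buf = io.BytesIO()
np.savez(buf, x=x, y=y)  # the .npz format records dtype, shape and byte order

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE arrays (id INTEGER PRIMARY KEY, npz BLOB)')
conn.execute('INSERT INTO arrays (npz) VALUES (?)', (buf.getvalue(),))

blob, = conn.execute('SELECT npz FROM arrays').fetchone()
data = np.load(io.BytesIO(blob))
assert (data['x'] == x).all() and (data['y'] == y).all()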