I'm trying to load a file that represents a 2D matrix with numpy's loadtxt:
cov = np.loadtxt("cov.csv")
The first element is in scientific notation, which causes a failure:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-80-0796abd8c0f7> in <module>()
----> 1 cov = np.loadtxt("cov.csv")
C:\home\Anaconda3\lib\site-packages\numpy\lib\npyio.py in loadtxt(fname, dtype, comments, delimiter, converters, skiprows, usecols, unpack, ndmin)
858
859 # Convert each value according to its column and store
--> 860 items = [conv(val) for (conv, val) in zip(converters, vals)]
861 # Then pack it according to the dtype's nesting
862 items = pack_items(items, packing)
C:\home\Anaconda3\lib\site-packages\numpy\lib\npyio.py in <listcomp>(.0)
858
859 # Convert each value according to its column and store
--> 860 items = [conv(val) for (conv, val) in zip(converters, vals)]
861 # Then pack it according to the dtype's nesting
862 items = pack_items(items, packing)
ValueError: could not convert string to float: b'5.0e-7,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0'
I couldn't find any documentation on how to make this work.
numpy version: 1.9.2
python version: 3.4.3 |Anaconda 2.3.0 (32-bit)| (default, Mar 6 2015, 12:08:17) [MSC v.1600 32 bit (Intel)]
As an alternative, you can read CSV data into a NumPy array with the genfromtxt() function, setting its delimiter parameter to a comma; genfromtxt() is used quite frequently to load data from text files. The loadtxt() function likewise loads data from a text file, and each row in the file must have the same number of values. In the other direction, numpy.savetxt() saves an array to a text file, and you can specify the delimiter character and many other arguments to control the output format.
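A minimal sketch of that alternative, assuming the same cov.csv as in the question:

import numpy as np

# genfromtxt also splits on whitespace by default, so the comma
# delimiter has to be given explicitly for a CSV file
cov = np.genfromtxt("cov.csv", delimiter=",")
print(cov.shape)  # reports the matrix dimensions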
It's not scientific notation that's the problem; it's that loadtxt's default delimiter is any whitespace. You're reading a CSV, so specify delimiter=",":
>>> np.loadtxt("cov.csv")
Traceback (most recent call last):
File "<ipython-input-1-6fdfa7467ef5>", line 1, in <module>
np.loadtxt("cov.csv")
File "/usr/local/lib/python3.4/dist-packages/numpy/lib/npyio.py", line 928, in loadtxt
items = [conv(val) for (conv, val) in zip(converters, vals)]
File "/usr/local/lib/python3.4/dist-packages/numpy/lib/npyio.py", line 928, in <listcomp>
items = [conv(val) for (conv, val) in zip(converters, vals)]
File "/usr/local/lib/python3.4/dist-packages/numpy/lib/npyio.py", line 659, in floatconv
return float(x)
ValueError: could not convert string to float: b'5.0e-7,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0'
>>> np.loadtxt("cov.csv", delimiter=",")
array([ 5.00000000e-07, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00])
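Since the goal is a 2D matrix, note that the same call returns a 2D array as soon as the file has more than one row; the ndmin parameter (visible in the loadtxt signature in the traceback above) can force that shape even for a one-row file. A minimal sketch, assuming the same cov.csv; the output filename is hypothetical:

import numpy as np

# ndmin=2 guarantees a 2D result even if the file holds a single row
cov = np.loadtxt("cov.csv", delimiter=",", ndmin=2)
print(cov.shape)

# write the matrix back out in the same comma-separated format
# ("cov_roundtrip.csv" is just an example name)
np.savetxt("cov_roundtrip.csv", cov, delimiter=",")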