I understand that statements like x[['col1','col2']] can be used to select columns from a numpy record array. My question is how to perform the same operation on a single row of a record array. Here is what I would like to do:
Python 2.7.7 (default, Jun 2 2014, 01:41:14)
[...]
IPython 2.1.0 -- An enhanced Interactive Python.
[...]
In [1]: import numpy as np
In [2]: x = np.ones(3, dtype=[('a',float),('b',float),('c',int)])
In [3]: x[['b','c']][0]
Out[3]: (1.0, 1)
In [4]: row0 = x[0]
In [5]: row0['b']
Out[5]: 1.0
In [6]: row0['c']
Out[6]: 1
In [7]: row0[['b','c']]
---------------------------------------------------------------------------
IndexError Traceback (most recent call last)
<ipython-input-7-287722bfeeaa> in <module>()
----> 1 row0[['b','c']]
IndexError: invalid index
I hoped that the last statement would give the result (1.0, 1), but instead I get the error shown above. How can I extract the specified columns from row0?
My real application has many more columns, and which columns I need depends on the row I'm in, so I cannot select the columns before the rows. Since the equivalent of row0[['b','c']] will be executed many times, I would like to avoid using a Python loop here.
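For concreteness, here is a minimal sketch of the kind of per-row selection I mean (the per-row column lists are made up for illustration); this is the loop I would like to avoid:
import numpy as np
x = np.ones(3, dtype=[('a', float), ('b', float), ('c', int)])
# Hypothetical: which columns are needed varies from row to row.
cols_per_row = [['a', 'b'], ['b', 'c'], ['a', 'c']]
# The per-row Python loop I would like to avoid:
values = [tuple(row[c] for c in cols)
          for row, cols in zip(x, cols_per_row)]
# [(1.0, 1.0), (1.0, 1), (1.0, 1)]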
Part of my problem may be that I don't know the data type of row0:
In [8]: row0
Out[8]: (1.0, 1.0, 1)
In [9]: row0.dtype
Out[9]: dtype([('a', '<f8'), ('b', '<f8'), ('c', '<i8')])
row0 has a dtype, so it is clearly not a plain Python tuple. But it does not seem to be a numpy array either:
In [10]: type(row0)
Out[10]: numpy.void
The problem is that when you do x[0] you get a np.void object with ndim=0. You can do x[0:1] instead in order to get a np.ndarray object with ndim=1, which will accept the column selection:
r0 = x[0:1]
r0[['b', 'c']]
#array([(1.0, 1)],
#      dtype=[('b', '<f8'), ('c', '<i8')])
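If you want the single record back as a scalar, as in the x[['b','c']][0] example from the question, index the selected columns with [0]:
r0[['b', 'c']][0]
#(1.0, 1)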
You can also transform the np.void into a np.ndarray, in this case by doing:
r0 = x[0]
r0 = r0.reshape(-1)
which will accept the column selection.
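Putting the second approach together (the output shown assumes the same session as above, where 'c' is a 64-bit int):
r0 = x[0].reshape(-1)   # view the np.void scalar as a 1-element structured array
r0[['b', 'c']]
#array([(1.0, 1)],
#      dtype=[('b', '<f8'), ('c', '<i8')])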