I was looking at the pandas DataFrame eval method (docs), which I find to be nice syntactic sugar that could also help improve performance.
This is the example from the docs:
from numpy.random import randn
import pandas as pd
df = pd.DataFrame(randn(10, 2), columns=list('ab'))
df.eval('a + b')
How can I use eval when there is a space in my column names?
Example:
df = pd.DataFrame(randn(10, 2), columns=["Col 1", "Col 2"])
I tried this:
df.eval('"Col 1" + "Col 2"')
but this gives an error:
TypeError: data type "Col 1" not understood
This does work:
pd.eval('df["Col 1"] + df["Col 2"]')
This keeps the argument to eval as a string, but it is less clean than the example without spaces in the column names.
For example:
print(df)
      Col 1     Col 2
0 -0.206838 -1.007173
1 -0.762453  1.178220
2 -0.431943 -0.804775
3  0.830659 -0.244472
4  0.111637  0.943254
5  0.206615  0.436250
6 -0.568307 -0.680140
7 -0.127645 -0.098351
8  0.185413 -1.224999
9  0.767931  1.512654
print(pd.eval('df["Col 1"] + df["Col 2"]'))
0   -1.214011
1    0.415768
2   -1.236718
3    0.586188
4    1.054891
5    0.642865
6   -1.248447
7   -0.225995
8   -1.039586
9    2.280585
dtype: float64
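Since the result is a regular Series, it can also be assigned straight back to the frame (the column name "Sum" below is just an illustration):
# Hypothetical usage: store the evaluated expression under a new column name
df["Sum"] = pd.eval('df["Col 1"] + df["Col 2"]')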
EDIT
After some investigation, it looks like the above method works in either Python 2.7 or 3.6 if you are using the python engine:
pd.eval('df["Col 1"] + df["Col 2"]', engine='python')
However, this does not give you the performance advantage that the numexpr engine can provide. In Python 2.7, this method works:
pd.eval('df["Col 1"] + df["Col 2"]', engine='numexpr')
but in Python 3.6 you get the error ValueError: unknown type str160.
My guess is that this is because pandas is passing a unicode string to numexpr in 3.6 but a bytestring in 2.7. This problem may be related to this issue and maybe this one as well.
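To see the performance gap between the two engines, you can time them yourself on a frame whose column names have no spaces, so that both engines accept the expression. This is only a rough sketch; the frame size and repeat count are assumptions:
import timeit
import numpy as np
import pandas as pd

# Build a large frame so the numexpr engine has a chance to pull ahead
df = pd.DataFrame(np.random.randn(1000000, 2), columns=list('ab'))

# Time the same expression with each engine
for engine in ('python', 'numexpr'):
    t = timeit.timeit(lambda: df.eval('a + b', engine=engine), number=20)
    print('{} engine: {:.3f}s'.format(engine, t))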
You can do this using:
df.eval(df["Col 1"] + df["Col 2"])
But that rather defeats the purpose of the eval function.
Alternatively, you can rename your columns to make them compatible with the eval syntax:
df.columns = df.columns.map(lambda x: x.replace(' ', '_'))
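Putting this together with the example from the question, the renamed columns can then be used with the plain eval syntax:
from numpy.random import randn
import pandas as pd

df = pd.DataFrame(randn(10, 2), columns=["Col 1", "Col 2"])

# Replace spaces with underscores so the names are valid identifiers
df.columns = df.columns.map(lambda x: x.replace(' ', '_'))

# Now the normal eval syntax works
print(df.eval('Col_1 + Col_2'))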