I would like to apply a function with arguments to a pandas Series. I have found two different solutions on SO:
python pandas: apply a function with arguments to a series
and
Passing multiple arguments to apply (Python)
Both of them rely on functools.partial, and they work absolutely fine. However, newer versions of pandas support passing extra arguments directly, and I do not understand how that works. Example:
import pandas as pd

a = pd.DataFrame({'x': [1, 2], 'y': [10, 20]})
a['x'].apply(lambda x, y: x + y, args=(100))
This fails with:
TypeError: <lambda>() argument after * must be a sequence, not int
The TypeError is saying that you passed the wrong type to the lambda function x + y. It's expecting args to be a sequence, but it got an int. You may have thought that (100) was a tuple (a sequence), but in Python it's the comma that makes a tuple:
In [10]: type((100))
Out[10]: int
In [11]: type((100,))
Out[11]: tuple
So change your last line to:
In [12]: a['x'].apply(lambda x, y: x + y, args=(100,))
Out[12]:
0 101
1 102
Name: x, dtype: int64
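For comparison, here is a minimal sketch of the functools.partial approach the question mentions, alongside two equivalent direct forms; the helper name add_100 is just an illustrative label, not part of either linked answer:

```python
import pandas as pd
from functools import partial

a = pd.DataFrame({'x': [1, 2], 'y': [10, 20]})

# Option 1: pre-bind y=100 with functools.partial, then apply the
# resulting one-argument function to the series.
add_100 = partial(lambda x, y: x + y, y=100)
result_partial = a['x'].apply(add_100)

# Option 2: pass the extra positional argument via args (note the
# trailing comma, which makes (100,) a one-element tuple).
result_args = a['x'].apply(lambda x, y: x + y, args=(100,))

# Option 3: apply also forwards keyword arguments to the function.
result_kw = a['x'].apply(lambda x, y: x + y, y=100)

print(result_partial.tolist())  # [101, 102]
```

All three produce the same Series; args unpacks its tuple as positional arguments after the element itself, which is why a bare int is rejected.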