Given a Pandas DataFrame with lists stored in several of the columns, is there a simple way to find the column name which contains the longest list for each row?
For example, with this data:
   positive                        negative                neutral
0  [marvel, moral, bold, destiny]  []                      [view, should]
1  [beautiful]                     [complicated, need]     []
2  [celebrate]                     [crippling, addiction]  [big]
I want to identify "positive" as the column with the longest list in the first row and "negative" in the second and third rows.
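For completeness, here is a minimal way to rebuild that frame, taken straight from the table above:

import pandas as pd

# each cell holds a plain Python list, as in the example data
df = pd.DataFrame({
    'positive': [['marvel', 'moral', 'bold', 'destiny'], ['beautiful'], ['celebrate']],
    'negative': [[], ['complicated', 'need'], ['crippling', 'addiction']],
    'neutral':  [['view', 'should'], [], ['big']],
})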
I thought I could use str.len() to calculate the list lengths and idxmax() to get the column names, but I can't figure out how to combine them.
IIUC:
In [227]: df.applymap(len).idxmax(axis=1)
Out[227]:
0    positive
1    negative
2    negative
dtype: object
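Note that DataFrame.applymap is deprecated as of pandas 2.1 in favor of DataFrame.map, so on newer versions the same idea can be written as:

df.map(len).idxmax(axis=1)

In case of a tie, idxmax returns the first column label with the maximum length.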
Or, applying along the rows (using idxmax rather than argmax, since Series.argmax now returns an integer position instead of the label):

>>> df.apply(lambda row: row.apply(len).idxmax(), axis=1)
0    positive
1    negative
2    negative
dtype: object
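Either result can be assigned back to the frame if you want it next to the original data (the column name "longest" here is just an illustration):

df['longest'] = df.map(len).idxmax(axis=1)  # use df.applymap(len) on pandas < 2.1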