I have the following dataframe:
pri    sec
TOM    AB,CD,EF
JACK   XY,YZ
HARRY  FG
NICK   KY,NY,SD,EF,FR
I need the following output, with column names based on how many comma-separated fields exist in column 'sec':
pri    sec             sec0  sec1  sec2  sec3  sec4
TOM    AB,CD,EF        AB    CD    EF    NaN   NaN
JACK   XY,YZ           XY    YZ    NaN   NaN   NaN
HARRY  FG              FG    NaN   NaN   NaN   NaN
NICK   KY,NY,SD,EF,FR  KY    NY    SD    EF    FR
Can I get any suggestions?
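For reference, this dataframe can be constructed like this (values taken from the table above):

import pandas as pd

df = pd.DataFrame({
    'pri': ['TOM', 'JACK', 'HARRY', 'NICK'],
    'sec': ['AB,CD,EF', 'XY,YZ', 'FG', 'KY,NY,SD,EF,FR'],
})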
Use join + split + add_prefix:
df = df.join(df['sec'].str.split(',', expand=True).add_prefix('sec'))
print (df)
     pri             sec  sec0  sec1  sec2  sec3  sec4
0    TOM        AB,CD,EF    AB    CD    EF  None  None
1   JACK           XY,YZ    XY    YZ  None  None  None
2  HARRY              FG    FG  None  None  None  None
3   NICK  KY,NY,SD,EF,FR    KY    NY    SD    EF    FR
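The None values come from str.split itself: with expand=True, rows with fewer pieces are padded with None, and add_prefix only renames the integer column labels 0..4 to sec0..sec4. You can inspect the intermediate result to confirm:

# intermediate DataFrame before add_prefix - columns are 0..4, short rows padded with None
print (df['sec'].str.split(',', expand=True))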
And if you need NaNs instead of None, add fillna:
import numpy as np

df = df.join(df['sec'].str.split(',', expand=True).add_prefix('sec').fillna(np.nan))
print (df)
     pri             sec  sec0  sec1  sec2  sec3  sec4
0    TOM        AB,CD,EF    AB    CD    EF   NaN   NaN
1   JACK           XY,YZ    XY    YZ   NaN   NaN   NaN
2  HARRY              FG    FG   NaN   NaN   NaN   NaN
3   NICK  KY,NY,SD,EF,FR    KY    NY    SD    EF    FR