PySpark: merge multiple columns into a JSON column

I asked this question a while back for Python (pandas), but now I need to do the same thing in PySpark.

I have a dataframe (df) like so:

|cust_id|address    |store_id|email        |sales_channel|category|
-------------------------------------------------------------------
|1234567|123 Main St|10SjtT  |[email protected]|ecom         |direct  |
|4567345|345 Main St|10SjtT  |[email protected]|instore      |direct  |
|1569457|876 Main St|51FstT  |[email protected]|ecom         |direct  |

and I would like to combine the last four fields into a single metadata field containing them as JSON, like so:

|cust_id|address    |metadata                                                                                     |
-------------------------------------------------------------------------------------------------------------------
|1234567|123 Main St|{'store_id':'10SjtT', 'email':'[email protected]','sales_channel':'ecom', 'category':'direct'}   |
|4567345|345 Main St|{'store_id':'10SjtT', 'email':'[email protected]','sales_channel':'instore', 'category':'direct'}|
|1569457|876 Main St|{'store_id':'51FstT', 'email':'[email protected]','sales_channel':'ecom', 'category':'direct'}   |

Here's the code I used to do this in Python with pandas:

cols = [
    'store_id',
    'category',
    'sales_channel',
    'email'
]

df1 = df.copy()
# pack the selected columns into a dict per row, then drop the originals
df1['metadata'] = df1[cols].to_dict(orient='records')
df1 = df1.drop(columns=cols)

but I would like to translate this to PySpark code that works with a Spark dataframe; I do NOT want to use pandas in Spark.

asked Dec 07 '22 by DBA108642
1 Answer

Use the to_json function to create the JSON object.

Example:

from pyspark.sql.functions import *

# sample data with a single row
df = spark.createDataFrame(
    [('1234567', '123 Main St', '10SjtT', '[email protected]', 'ecom', 'direct')],
    ['cust_id', 'address', 'store_id', 'email', 'sales_channel', 'category'])

# wrap the four columns in a struct and serialize it to a JSON string
df.select("cust_id", "address",
          to_json(struct("store_id", "category", "sales_channel", "email")).alias("metadata")
          ).show(10, False)

#result
+-------+-----------+----------------------------------------------------------------------------------------+
|cust_id|address    |metadata                                                                                |
+-------+-----------+----------------------------------------------------------------------------------------+
|1234567|123 Main St|{"store_id":"10SjtT","category":"direct","sales_channel":"ecom","email":"[email protected]"}|
+-------+-----------+----------------------------------------------------------------------------------------+

to_json by passing a list of columns:

ll = ['store_id', 'email', 'sales_channel', 'category']

# build the struct directly from the column-name list, then drop the originals
df.withColumn("metadata", to_json(struct(*ll))).drop(*ll).show(truncate=False)

#result
+-------+-----------+----------------------------------------------------------------------------------------+
|cust_id|address    |metadata                                                                                |
+-------+-----------+----------------------------------------------------------------------------------------+
|1234567|123 Main St|{"store_id":"10SjtT","email":"[email protected]","sales_channel":"ecom","category":"direct"}|
+-------+-----------+----------------------------------------------------------------------------------------+
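
For completeness, here's a minimal sketch (assuming an active spark session, using the question's three sample rows with the redacted email strings kept as placeholders) showing that the same pattern reproduces the desired metadata column:

from pyspark.sql.functions import to_json, struct

# sample rows mirroring the question's dataframe
data = [
    ('1234567', '123 Main St', '10SjtT', '[email protected]', 'ecom', 'direct'),
    ('4567345', '345 Main St', '10SjtT', '[email protected]', 'instore', 'direct'),
    ('1569457', '876 Main St', '51FstT', '[email protected]', 'ecom', 'direct'),
]
df = spark.createDataFrame(
    data, ['cust_id', 'address', 'store_id', 'email', 'sales_channel', 'category'])

meta_cols = ['store_id', 'email', 'sales_channel', 'category']

# pack the four columns into one JSON string column and drop the originals
df.withColumn('metadata', to_json(struct(*meta_cols))).drop(*meta_cols).show(truncate=False)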
answered Dec 10 '22 by notNull