How can I read the following JSON structure into a Spark DataFrame using PySpark?
My JSON structure
{"results":[{"a":1,"b":2,"c":"name"},{"a":2,"b":5,"c":"foo"}]}
I have tried:
df = spark.read.json('simple.json')
I want a, b, and c as columns, with their values as the corresponding rows.
Thanks.
JSON strings as variables
If you have the JSON as a string variable, you can do the following:
simple_json = '{"results":[{"a":1,"b":2,"c":"name"},{"a":2,"b":5,"c":"foo"}]}'
rddjson = sc.parallelize([simple_json])
df = sqlContext.read.json(rddjson)
from pyspark.sql import functions as F
df.select(F.explode(df.results).alias('results')).select('results.*').show(truncate=False)
which will give you
+---+---+----+
|a |b |c |
+---+---+----+
|1 |2 |name|
|2 |5 |foo |
+---+---+----+
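The explode step above turns each element of the results array into its own row, and the struct fields a, b, c become columns. A plain-Python equivalent of that flattening, shown only to make the resulting row shapes concrete (the names parsed and rows are illustrative, not part of any Spark API):

```python
import json

simple_json = '{"results":[{"a":1,"b":2,"c":"name"},{"a":2,"b":5,"c":"foo"}]}'
parsed = json.loads(simple_json)

# One row per element of the "results" array; the struct fields
# a, b, c become the column values, matching the table above.
rows = [(r["a"], r["b"], r["c"]) for r in parsed["results"]]
print(rows)  # [(1, 2, 'name'), (2, 5, 'foo')]
```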
JSON strings as separate lines in a file (sparkContext and sqlContext)
If the JSON strings are stored as separate lines in a file, you can read them with sparkContext into an RDD of strings as above; the rest of the process is the same:
rddjson = sc.textFile('/home/anahcolus/IdeaProjects/pythonSpark/test.csv')
df = sqlContext.read.json(rddjson)
df.select(F.explode(df['results']).alias('results')).select('results.*').show(truncate=False)
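This approach assumes the file holds one complete JSON document per line (JSON Lines), not one pretty-printed document spanning several lines. A small stdlib-only sketch of that layout, with a made-up temporary file standing in for the path above, where each line must parse on its own just as sqlContext.read.json requires of each record in the RDD:

```python
import json
import os
import tempfile

# Hypothetical file contents: one self-contained JSON document per line.
lines = [
    '{"results":[{"a":1,"b":2,"c":"name"}]}',
    '{"results":[{"a":2,"b":5,"c":"foo"}]}',
]

with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    f.write("\n".join(lines))
    path = f.name

# Every line parses independently -- the property the RDD-based
# read above depends on.
with open(path) as f:
    records = [json.loads(line) for line in f]
os.remove(path)
print(len(records))  # 2
```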
JSON strings as separate lines in a file (sqlContext only)
If the JSON strings are stored as separate lines in a file, you can also use sqlContext alone, but the process is more involved because you have to define the schema yourself:
df = sqlContext.read.text('path to the file')
from pyspark.sql import functions as F
from pyspark.sql import types as T
schema = T.StructType([
    T.StructField('results', T.ArrayType(T.StructType([
        T.StructField('a', T.IntegerType()),
        T.StructField('b', T.IntegerType()),
        T.StructField('c', T.StringType()),
    ])))
])
df = df.select(F.from_json(df.value, schema).alias('results'))
df.select(F.explode(df['results.results']).alias('results')).select('results.*').show(truncate=False)
which should give you the same result as above.
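As a side note, in Spark 2.3 and later from_json also accepts a DDL-formatted schema string, which is usually easier to read than a nested StructType. A sketch of the equivalent schema (the from_json call itself is shown as a comment since it needs the df from above; check your Spark version before relying on this):

```python
# DDL-formatted schema string equivalent to the StructType built above.
schema_ddl = "results ARRAY<STRUCT<a: INT, b: INT, c: STRING>>"

# df = df.select(F.from_json(df.value, schema_ddl).alias('results'))
```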
I hope this answer is helpful.