
Saving JSON in scala from SparkSQL

I am using Spark SQL to extract some information from a JSON file. I want to save the result of the SQL analysis to another JSON file so I can plot it with Tableau or d3.js, but I don't know exactly how to do that. Any suggestions?

val inputTable = sqlContext.jsonFile(inputDirectory).cache()
inputTable.registerTempTable("tweetTable")

val languages = sqlContext.sql("""
        SELECT 
            user.lang, 
            COUNT(*) as cnt
        FROM tweetTable 
        GROUP BY user.lang
        ORDER BY cnt DESC 
        LIMIT 15""")
languages.rdd.saveAsTextFile(outputDirectory + "/lang")
languages.collect.foreach(println)

I wouldn't mind saving my data to a .csv file either, but I don't know exactly how to do that.

Thanks!

asked Dec 07 '25 by lds

1 Answer

It is just

val languagesDF: DataFrame = sqlContext.sql("<YOUR_QUERY>")
languagesDF.write.json("your.json")

You do not need to go back to a RDD.
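Since the question also mentions CSV: on Spark 2.x and later the DataFrame writer supports CSV directly (on 1.x you would need the external spark-csv package instead). A minimal sketch, reusing the `languagesDF` and `outputDirectory` names from the snippets above:

```scala
// Hedged sketch: write the same DataFrame as CSV (Spark 2.x+ built-in writer).
languagesDF.write
  .option("header", "true")                 // include column names in the output
  .csv(outputDirectory + "/lang_csv")       // produces part files under this directory
```

Like the JSON writer, this produces a directory of part files, not a single `.csv` file.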

Still, take care: your JSON output will be split into multiple part files. If that is not your intention, read

  • Save a large Spark Dataframe as a single json file in S3 and
  • Write single CSV file using spark-csv (here for CSV but can easily be adapted to JSON)

on how to circumvent this (if really required). The main point is to use repartition or coalesce.
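For a small result like this top-15 table, a sketch of the coalesce approach (assuming the `languagesDF` and `outputDirectory` names used above):

```scala
// Hedged sketch: collapse to a single partition before writing, so the
// output directory contains only one part file. coalesce(1) moves all
// data through a single task, so reserve this for small results.
languagesDF
  .coalesce(1)
  .write
  .json(outputDirectory + "/lang_single")
```

Note that even with one partition, Spark still writes a directory (containing one part file plus a `_SUCCESS` marker), not a single bare file; renaming or moving the part file is a separate post-processing step.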

answered Dec 11 '25 by Martin Senne