This command works with HiveQL:
insert overwrite directory '/data/home.csv' select * from testtable;
But with Spark SQL I'm getting an error with an org.apache.spark.sql.hive.HiveQl
stack trace:
java.lang.RuntimeException: Unsupported language features in query: insert overwrite directory '/data/home.csv' select * from testtable
Please guide me on how to write an export-to-CSV feature with Spark SQL.
In Spark, you can save (write/extract) a DataFrame to a CSV file on disk by using dataframeObj.write.csv("path"); with the same call you can also write the DataFrame to AWS S3, Azure Blob Storage, HDFS, or any other Spark-supported file system.
Spark SQL provides spark.read().csv("file_name") to read a file or a directory of files in CSV format into a Spark DataFrame, and dataframe.write().csv("path") to write a DataFrame out to a CSV file.
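For the read side, a minimal sketch (assuming an existing SparkSession named spark and a hypothetical input path):

val df = spark.read
  .option("header", "true")        // first line contains the column names
  .option("inferSchema", "true")   // let Spark infer the column types
  .csv("/data/home/input.csv")     // can also point at a directory of CSV files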
You can use the statement below to write the contents of a DataFrame in CSV format:
df.write.csv("/data/home/csv")
If you need to write the whole DataFrame into a single CSV file, then use:
df.coalesce(1).write.csv("/data/home/sample.csv")
For Spark 1.x, you can use spark-csv to write the results into CSV files. The Scala snippet below should help:
import org.apache.spark.sql.hive.HiveContext
// sc - existing spark context
val sqlContext = new HiveContext(sc)
val df = sqlContext.sql("SELECT * FROM testtable")
df.write.format("com.databricks.spark.csv").save("/data/home/csv")
To write the contents into a single file
import org.apache.spark.sql.hive.HiveContext
// sc - existing spark context
val sqlContext = new HiveContext(sc)
val df = sqlContext.sql("SELECT * FROM testtable")
df.coalesce(1).write.format("com.databricks.spark.csv").save("/data/home/sample.csv")
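Note that in Spark 1.x spark-csv is an external package, so it has to be on the classpath when you start the shell or submit the job. One common way to do that (the artifact coordinates below are an assumption; pick the version matching your Scala build) is:

spark-shell --packages com.databricks:spark-csv_2.10:1.5.0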
Since Spark 2.x, spark-csv is integrated as a native data source. Therefore, the necessary statement simplifies to (Windows):
df.write.option("header", "true").csv("file:///C:/out.csv")
or on UNIX:
df.write.option("header", "true").csv("/var/out.csv")
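Other options are frequently set on the CSV writer as well; a short sketch using standard Spark 2.x CSV options (the values are only examples):

df.write
  .option("header", "true")        // write column names as the first line
  .option("sep", "|")              // field delimiter, default ","
  .option("nullValue", "")         // string written for null values
  .option("compression", "gzip")   // compress the part files
  .mode("overwrite")               // replace the target directory if it exists
  .csv("/var/out.csv")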
Notice: as the comments say, this creates a directory with that name containing the partition files, not a single standard CSV file. This, however, is most likely what you want, since otherwise you would either crash your driver (by pulling everything into its RAM) or you would have to be working in a non-distributed environment.
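If you really do need one plain CSV file rather than a directory of part files, a minimal sketch (assuming Spark 2.x, an existing SparkSession named spark, data small enough to fit in a single partition, and hypothetical paths) is to coalesce to one partition and then rename the resulting part file through the Hadoop FileSystem API:

import org.apache.hadoop.fs.{FileSystem, Path}

val outDir = "/data/home/csv_out"              // hypothetical temporary output directory
df.coalesce(1)                                 // one partition => one part file
  .write
  .option("header", "true")
  .mode("overwrite")
  .csv(outDir)

// Locate the single part file and rename it to a plain CSV path.
val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
val partFile = fs.globStatus(new Path(outDir + "/part-*.csv"))(0).getPath
fs.rename(partFile, new Path("/data/home/out.csv"))  // hypothetical final location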