I'm using Apache Spark 1.0.1. I have many files delimited with the UTF-8 character \u0001 rather than the usual newline \n. How can I read such files in Spark? In other words, the default record delimiter of sc.textFile("hdfs:///myproject/*") is \n, and I want to change it to \u0001.
The spark.read.text() method reads a text file into a DataFrame. As with the RDD API, it can read multiple files at a time, read files matching a pattern, or read all files from a directory.
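For instance, a minimal sketch (assuming a newer Spark version with a SparkSession available as spark, and hypothetical HDFS paths):

// one file
val one = spark.read.text("hdfs:///myproject/file1.txt")
// several files at once
val some = spark.read.text("hdfs:///myproject/file1.txt", "hdfs:///myproject/file2.txt")
// all files matching a pattern
val matched = spark.read.text("hdfs:///myproject/part-*.txt")
// every file in a directory
val all = spark.read.text("hdfs:///myproject/")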
Spark core provides the textFile() and wholeTextFiles() methods on SparkContext, which read single or multiple text or CSV files into a single Spark RDD. Both methods can also read all files from a directory, or only files matching a specific pattern.
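A quick sketch of both methods (reusing the hypothetical hdfs:///myproject path from the question):

// RDD[String]: one element per record (newline-delimited by default)
val lines = sc.textFile("hdfs:///myproject/*")
// RDD[(String, String)]: (file path, entire file content) pairs
val files = sc.wholeTextFiles("hdfs:///myproject/")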
Spark can also read CSV files whose records span multiple lines by enabling the multiline option on the CSV reader.
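A hedged sketch of that option (assuming a hypothetical CSV file with quoted fields that contain newlines):

val multilineDf = spark.read
  .option("multiLine", "true")
  .option("header", "true") // assumption: the file has a header row
  .csv("hdfs:///myproject/multiline.csv")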
By default, the Spark CSV data source assumes that records use a comma delimiter. If your files use another delimiter, such as the pipe character (|), pass it with spark.read.option("delimiter", "|").
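For example, a sketch for a hypothetical pipe-delimited file:

val pipeDf = spark.read
  .option("delimiter", "|")
  .option("header", "true") // assumption: the file has a header row
  .csv("hdfs:///myproject/data.psv")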
You can use textinputformat.record.delimiter to set the record delimiter for TextInputFormat, e.g.:
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

// Copy the existing Hadoop configuration and override the record delimiter
val conf = new Configuration(sc.hadoopConfiguration)
conf.set("textinputformat.record.delimiter", "X")

// Read the file with the new Hadoop API; each element is a (byte offset, record) pair
val input = sc.newAPIHadoopFile("file_path", classOf[TextInputFormat], classOf[LongWritable], classOf[Text], conf)

// Keep only the record text
val lines = input.map { case (_, text) => text.toString }

println(lines.collect().mkString("Array(", ", ", ")"))
For example, if the input is a file containing the single line aXbXcXd, the code above will output Array(a, b, c, d).
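Applied back to the question, a minimal sketch (reusing the imports above and the HDFS glob from the question) that sets the delimiter to \u0001:

val conf = new Configuration(sc.hadoopConfiguration)
// records are separated by the \u0001 control character, per the question
conf.set("textinputformat.record.delimiter", "\u0001")
val records = sc.newAPIHadoopFile("hdfs:///myproject/*", classOf[TextInputFormat], classOf[LongWritable], classOf[Text], conf)
  .map { case (_, text) => text.toString }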