The following code works with Spark 1.5.2 but not with Spark 2.0.0. I am using Java 1.8.
final SparkConf sparkConf = new SparkConf();
sparkConf.setMaster("local[4]"); // Four threads
final JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);
final JavaRDD<String> javaRDDLines = javaSparkContext.textFile("4300.txt");
final JavaRDD<String> javaRDDWords = javaRDDLines.flatMap(line -> Arrays.asList(line.split(" "))); // Compile error on this line
I get the following error:
Error:(46, 66) java: incompatible types: no instance(s) of type variable(s) T exist so that java.util.List<T> conforms to java.util.Iterator<U>
I am unable to figure out whether the Spark API has changed or something else is wrong. Please help. Thanks.
In 2.0, FlatMapFunction.call() returns an Iterator rather than an Iterable. Try this:
JavaRDD<String> javaRDDWords = javaRDDLines.flatMap(line -> Arrays.asList(line.split(" ")).iterator());
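For context, here is a minimal self-contained sketch of the same job against Spark 2.x, written with an explicit FlatMapFunction instead of a lambda so the changed return type (Iterator instead of Iterable) is visible in the signature. The class name WordSplit and the setAppName value are placeholders for illustration, not from your code.

import java.util.Arrays;
import java.util.Iterator;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;

public class WordSplit {
    public static void main(String[] args) {
        // App name is required by SparkConf; the value here is arbitrary
        SparkConf sparkConf = new SparkConf().setAppName("WordSplit").setMaster("local[4]");
        try (JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf)) {
            JavaRDD<String> javaRDDLines = javaSparkContext.textFile("4300.txt");

            // In Spark 2.x, FlatMapFunction.call() must return Iterator<String>,
            // not Iterable<String> as in 1.x, hence the .iterator() call below.
            JavaRDD<String> javaRDDWords = javaRDDLines.flatMap(
                new FlatMapFunction<String, String>() {
                    @Override
                    public Iterator<String> call(String line) {
                        return Arrays.asList(line.split(" ")).iterator();
                    }
                });

            System.out.println("Word count: " + javaRDDWords.count());
        }
    }
}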