I was trying to run the first example. The source code is:
/*keyWordCount.java */
import org.apache.spark.*;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.*;
import org.apache.spark.rdd.*;
import org.apache.spark.api.java.JavaRDD;
import java.util.*;
public class keyWordCount {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("keyWordCount");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> textFile = sc.textFile("output");
        JavaRDD<String> dictFile = sc.textFile("keyword");
        JavaRDD<String> words = textFile.flatMap(new FlatMapFunction<String, String>() {
            @Override public Iterable<String> call(String s) { return Arrays.asList(s.split(" ")); }
        });
    }
}
When I compile using mvn compile package, the following error keeps showing up:
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /home/cyberliem/TestSpark/src/main/java/keyWordCount.java:[16,81] error: is not abstract and does not override abstract method call(String) in FlatMapFunction
[ERROR] /home/cyberliem/TestSpark/src/main/java/keyWordCount.java:[17,39] error: call(String) in cannot implement call(T) in FlatMapFunction
[ERROR]   T extends Object declared in interface FlatMapFunction
[ERROR]   R extends Object declared in interface FlatMapFunction
[ERROR] /home/cyberliem/TestSpark/src/main/java/keyWordCount.java:[17,5] error: method does not override or implement a method from a supertype
I'm not sure how to fix this; can anyone give me an idea of why it goes wrong?
Try this one. In Spark 2.x, FlatMapFunction.call must return a java.util.Iterator<String> rather than an Iterable<String>, which is why the compiler says your anonymous class does not override call(String):
JavaRDD<String> words = textFile.flatMap(new FlatMapFunction<String, String>() {
    @Override public Iterator<String> call(String s) { return Arrays.asList(s.split(" ")).iterator(); }
});
or, even simpler, using a Java 8 lambda:
JavaRDD<String> words = textFile.flatMap(l -> Arrays.asList(l.split(" ")).iterator());
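The whole fix boils down to returning an iterator over the split words instead of the list itself. Here is a minimal plain-Java sketch of that pattern, with no Spark dependency; the class name IteratorDemo and the standalone call method are my own illustration, mirroring only the Iterator-returning signature the Spark 2.x interface expects:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class IteratorDemo {
    // Mirrors the Spark 2.x FlatMapFunction shape: call(T) returns Iterator<R>,
    // not Iterable<R> as in Spark 1.x.
    static Iterator<String> call(String line) {
        return Arrays.asList(line.split(" ")).iterator();
    }

    public static void main(String[] args) {
        // Drain the iterator the way flatMap conceptually does.
        List<String> words = new ArrayList<>();
        call("hello spark world").forEachRemaining(words::add);
        System.out.println(words); // prints [hello, spark, world]
    }
}
```

Compiling this class under Java 8 (the same toolchain Maven uses for the Spark build) confirms the iterator-returning signature is what the compiler needs to see.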