I am trying to build a movie recommender system using Apache Spark MLlib.
I have written the recommender code in Java and it works fine when run with the spark-submit
command.
My run command looks like this:
bin/spark-submit --jars /opt/poc/spark-1.3.1-bin-hadoop2.6/mllib/spark-mllib_2.10-1.0.0.jar --class "com.recommender.MovieLensALSExtended" --master local[4] /home/sarvesh/Desktop/spark-test/recommender.jar /home/sarvesh/Desktop/spark-test/ml-latest-small/ratings.csv /home/sarvesh/Desktop/spark-test/ml-latest-small/movies.csv
Now I want to use my recommender in a real-world scenario, as a web application that I can query for recommendations.
I want to build a Spring MVC web application that can interact with an Apache Spark context and return results when asked.
My question is: how can I build an application that interacts with Apache Spark running on a cluster, so that when a request reaches the controller it takes the user query and returns the same result that the spark-submit
command prints on the console?
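For illustration, this is roughly the kind of endpoint I have in mind (the class name and mapping are made up; the body is exactly the part I am asking about):

import org.apache.spark.mllib.recommendation.Rating;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class RecommendationController {

    // Should return the same recommendations that my spark-submit run
    // currently prints on the console
    @GetMapping("/recommendations/{userId}")
    public Rating[] recommend(@PathVariable int userId) {
        // How do I reach the Spark context / trained model from here?
        throw new UnsupportedOperationException("not implemented yet");
    }
}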
As far as I have searched, I found that we can use Spark SQL and integrate it with JDBC, but I did not find a good example.
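From what I understand, that approach means running the Spark Thrift Server and querying it over JDBC with the standard Hive driver, roughly like the sketch below (the host, port, and table name are my own assumptions):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SparkSqlJdbcExample {
    public static void main(String[] args) throws Exception {
        // The Spark Thrift Server speaks the HiveServer2 protocol,
        // so the plain Hive JDBC driver can be used
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "", "");
             Statement stmt = conn.createStatement();
             // "ratings" is a table I would first have to register in Spark SQL
             ResultSet rs = stmt.executeQuery(
                     "SELECT movieId, COUNT(*) AS cnt FROM ratings GROUP BY movieId")) {
            while (rs.next()) {
                System.out.println(rs.getInt("movieId") + " -> " + rs.getLong("cnt"));
            }
        }
    }
}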
Thanks in advance.
Just pass the Spark context and session as beans in Spring:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SparkConfig {

    // Resolved from application properties, e.g. spark.master.uri=local[4]
    @Value("${spark.app.name}")
    private String appName;
    @Value("${spark.home}")
    private String sparkHome;
    @Value("${spark.master.uri}")
    private String masterUri;

    @Bean
    public SparkConf sparkConf() {
        return new SparkConf()
                .setAppName(appName)
                .setSparkHome(sparkHome)
                .setMaster(masterUri);
    }

    @Bean
    public JavaSparkContext javaSparkContext() {
        return new JavaSparkContext(sparkConf());
    }

    @Bean
    public SparkSession sparkSession() {
        return SparkSession
                .builder()
                .sparkContext(javaSparkContext().sc())
                .appName("Java Spark Ravi")
                .getOrCreate();
    }
}
The same can be done with XML-based configuration.
Fully working code with Spring and Spark is available here:
https://github.com/ravi-code-ranjan/spark-spring-seed-project