Spark: how to run a Spark file from the Spark shell

I am using CDH 5.2. I am able to use spark-shell to run commands.

  1. How can I run a file (file.spark) that contains Spark commands?
  2. Is there any way to run/compile Scala programs in CDH 5.2 without sbt?
Ramakrishna asked Dec 31 '14


People also ask

How do I run PySpark code in the Spark shell?

Go to the Spark installation directory on the command line, type bin/pyspark, and press Enter; this launches the PySpark shell and gives you a prompt for interacting with Spark in Python. If Spark's bin directory is on your PATH, you can just enter pyspark in a terminal.
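As a minimal sketch of the above (the installation path is an assumption; substitute your own Spark location):

```shell
# Launch PySpark from inside the Spark installation directory
# (/opt/spark is a placeholder path, not from the original page)
cd /opt/spark
bin/pyspark

# Or, if Spark's bin directory is already on your PATH:
pyspark
```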


1 Answer

From the command line, you can use

spark-shell -i file.scala

to run the code written in file.scala.
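A short sketch of both ways to do this (file.scala is the file name from the question; its contents here are only an illustrative assumption):

```shell
# Suppose file.scala contains ordinary spark-shell commands, e.g.:
#   val data = sc.parallelize(1 to 100)
#   println(data.sum())

# Run it non-interactively from the command line:
spark-shell -i file.scala

# Alternatively, from inside an already-running spark-shell,
# the REPL's :load command evaluates a file line by line:
#   scala> :load file.scala
```

Note that spark-shell -i runs the file and then drops you into the interactive prompt; append an exit command (e.g. System.exit(0)) at the end of the script if you want the shell to terminate afterwards.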

Ziyao Li answered Sep 22 '22