from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext
conf = SparkConf().setAppName("Test").set("spark.driver.memory", "1g")
sc = SparkContext(conf = conf)
sqlContext = SQLContext(sc)
results = sqlContext.sql("/home/ubuntu/workload/queryXX.sql")
When I run this script with python test.py, it gives me an error:
py4j.protocol.Py4JJavaError: An error occurred while calling o20.sql. : java.lang.RuntimeException: [1.1] failure: ``with'' expected but `/' found
/home/ubuntu/workload/queryXX.sql
at scala.sys.package$.error(package.scala:27)
I am very new to Spark and need some help to move forward.
SQLContext.sql expects a valid SQL query, not a path to a file. Try this:
with open("/home/ubuntu/workload/queryXX.sql") as fr:
    query = fr.read()
results = sqlContext.sql(query)
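Putting that together with the setup from the question, a minimal sketch of the corrected script could look like this (still using the queryXX.sql path from the question):

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = SparkConf().setAppName("Test").set("spark.driver.memory", "1g")
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)

# sqlContext.sql() takes the query text itself, so read the file into a string first
with open("/home/ubuntu/workload/queryXX.sql") as fr:
    query = fr.read()

results = sqlContext.sql(query)
results.show()  # inspect the result; use collect() or write as needed

One caveat: sqlContext.sql() runs a single statement, so if the file contains several semicolon-separated queries you would need to split them and submit them one at a time.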
Running spark-sql --help will give you:
CLI options:
 -d,--define <key=value>          Variable subsitution to apply to hive
                                  commands. e.g. -d A=B or --define A=B
    --database <databasename>     Specify the database to use
 -e <quoted-query-string>         SQL from command line
 -f <filename>                    SQL from files
 -H,--help                        Print help information
    --hiveconf <property=value>   Use value for given property
    --hivevar <key=value>         Variable subsitution to apply to hive
                                  commands. e.g. --hivevar A=B
 -i <filename>                    Initialization SQL file
 -S,--silent                      Silent mode in interactive shell
 -v,--verbose                     Verbose mode (echo executed SQL to the
                                  console)
So you can execute your SQL script like this:
spark-sql -f <your-script>.sql
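If the script is parameterized, the --hivevar (or -d/--define) options listed above can pass values in. For example, assuming a placeholder named DT inside the script (both the variable name and the value here are illustrative):

spark-sql --hivevar DT=2015-01-01 -f /home/ubuntu/workload/queryXX.sql

Inside the file you would then reference the variable as ${hivevar:DT} (or just ${DT}), provided variable substitution is enabled, which it is by default.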