I am trying to run a local jar file with spark-submit, which works perfectly fine. Here is the command:
spark-submit --class "SimpleApp" --master local myProject/target/scala-2.11/simple-project_2.11-1.0.jar
But when I try the same thing through Livy with curl:
curl -X POST --data '{
  "file": "file:///home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar",
  "className": "SimpleApp"
}' \
  -H "Content-Type: application/json" \
  http://server:8998/batches
it throws the following error:
"requirement failed: Local path /home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar cannot be added to user sessions."
Here is my livy.conf file, since some articles suggest changing a few settings in it:
# What host address to start the server on. By default, Livy will bind to all network interfaces.
livy.server.host = 0.0.0.0
# What port to start the server on.
livy.server.port = 8998
# What spark master Livy sessions should use.
livy.spark.master = local
# What spark deploy mode Livy sessions should use.
livy.spark.deploy-mode = client
# List of local directories from where files are allowed to be added to user sessions. By
# default it's empty, meaning users can only reference remote URIs when starting their
# sessions.
livy.file.local-dir-whitelist = /home/user/.livy-sessions/
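If I read the comment above correctly, a file:// URI is only accepted when it points inside one of the whitelisted directories, so presumably the request would only pass if the jar were first copied under /home/user/.livy-sessions/. A sketch of that (which I have not verified; it assumes the Livy server process can read that directory):
# Copy the jar into the whitelisted directory from livy.conf above
cp myProject/target/scala-2.11/simple-project_2.11-1.0.jar /home/user/.livy-sessions/

# ...and reference the whitelisted copy in the batch request
curl -X POST --data '{
  "file": "file:///home/user/.livy-sessions/simple-project_2.11-1.0.jar",
  "className": "SimpleApp"
}' \
  -H "Content-Type: application/json" \
  http://server:8998/batches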
Please help me out with this.
Thanks in advance.
I recently found the solution for reading a local file through Apache Livy: I was building the wrong request with cURL. I just replaced the URI scheme 'file://' with 'local:/' and that works for me.
curl -X POST --data '{
  "file": "local:/home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar",
  "className": "SimpleApp"
}' \
  -H "Content-Type: application/json" \
  http://server:8998/batches
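As far as I understand it, 'local:/' tells Spark that the jar already exists at that path on the machine where the driver runs, so Livy does not try to upload it and the local-dir-whitelist check no longer applies. Once the POST returns an id, the batch can be checked back over the same REST API (a sketch; the id 0 below is just a placeholder for whatever id the POST returned):
# Check the state of the submitted batch (replace 0 with the returned id)
curl http://server:8998/batches/0

# Driver output and Spark logs for the batch
curl http://server:8998/batches/0/log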
It was quite a small mistake, but my jar file still cannot be accessed from HDFS.
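For anyone wondering what I mean by the HDFS case: roughly, uploading the jar to HDFS and pointing the batch at an hdfs:// URI, along these lines (the /user/user/jars path is only an example):
# Upload the jar to HDFS (example path)
hdfs dfs -put myProject/target/scala-2.11/simple-project_2.11-1.0.jar /user/user/jars/

# Reference the HDFS copy in the batch request
curl -X POST --data '{
  "file": "hdfs:///user/user/jars/simple-project_2.11-1.0.jar",
  "className": "SimpleApp"
}' \
  -H "Content-Type: application/json" \
  http://server:8998/batches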
Thank you all for helping out.