How to redirect entire output of spark-submit to a file

So, I am trying to redirect the output of an Apache spark-submit command to a text file, but some of the output fails to appear in the file. Here is the command I am using:

spark-submit something.py > results.txt

I can see the output in the terminal but I do not see it in the file. What am I forgetting or doing wrong here?

Edit:

If I use

spark-submit something.py | less

I can see all the output being piped into less.

asked Sep 26 '17 by timbram

2 Answers

spark-submit prints most of its output to STDERR.

To redirect the entire output to one file, you can use:

spark-submit something.py > results.txt 2>&1

Or, in bash:

spark-submit something.py &> results.txt
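To see why the plain `>` redirect missed most of the output, here is a minimal sketch. The `print_both` function below is a hypothetical stand-in for spark-submit (not part of Spark) that writes one line to each stream:

```shell
# Hypothetical stand-in for spark-submit: writes to both STDOUT and STDERR
print_both() {
    echo "stdout line"            # goes to STDOUT
    echo "stderr line" >&2        # goes to STDERR, like most spark-submit logging
}

print_both > only_stdout.txt 2>/dev/null   # STDERR is not captured in the file
print_both > both.txt 2>&1                 # both streams end up in both.txt
```

After running this, `only_stdout.txt` contains only the stdout line, while `both.txt` contains both, which matches the behavior seen with spark-submit.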
answered Sep 20 '22 by philantrovert

If you are running spark-submit on a cluster, the logs are stored under the application ID, and you can retrieve them once the application finishes:

yarn logs -applicationId <your applicationId> > myfile.txt

This should fetch the logs of your job.

The applicationId of your job is printed when you submit the Spark job. You can see it in the console where you submitted, or in the Hadoop ResourceManager UI.
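If you want to capture that ID automatically, one possible sketch is to grep it out of the submission output. The sample log line below is illustrative only; the exact format printed by the YARN client varies by Spark and Hadoop version:

```shell
# Sample line imitating what spark-submit's YARN client prints on submission
# (illustrative; real output varies by version):
SAMPLE='17/09/26 15:09:00 INFO yarn.Client: Submitted application application_1506441234567_0042'

# Extract the applicationId with a regex match
APP_ID=$(echo "$SAMPLE" | grep -oE 'application_[0-9]+_[0-9]+')
echo "$APP_ID"

# Once the job finishes, fetch its aggregated logs (requires log aggregation
# to be enabled on the cluster):
# yarn logs -applicationId "$APP_ID" > myfile.txt
```

In a real pipeline you would run `spark-submit ... 2>&1 | tee submit.log` and grep the ID from `submit.log` instead of a sample string.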

answered Sep 23 '22 by Avishek Bhattacharya