
Spark step on EMR just hangs as "Running" after done writing to S3

I'm running a PySpark 2 job on EMR 5.1.0 as a step. Even after the script finishes, with a _SUCCESS file written to S3 and the Spark UI showing the job as completed, EMR still shows the step as "Running". I've waited for over an hour to see if Spark was just cleaning itself up, but the step never shows as "Completed". The last thing written in the logs is:

INFO MultipartUploadOutputStream: close closed:false s3://mybucket/some/path/_SUCCESS
INFO DefaultWriterContainer: Job job_201611181653_0000 committed.
INFO ContextCleaner: Cleaned accumulator 0

I didn't have this problem with Spark 1.6. I've tried a bunch of different hadoop-aws and aws-java-sdk jars to no avail.

I'm using the default Spark 2.0 configurations, so I don't think anything else like metadata is being written. Also, the size of the data doesn't seem to have an impact on this problem.
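For reference, here is a minimal sketch of the kind of job described above, assuming a PySpark 2 script submitted as an EMR step that writes to S3 with default configurations; the bucket and paths are hypothetical placeholders, not taken from the actual job:

from pyspark.sql import SparkSession

# Build a session with the default Spark 2.0 configuration.
spark = SparkSession.builder.appName("emr-step-example").getOrCreate()

# Read some input and write the result out to S3.
df = spark.read.json("s3://mybucket/some/input/")
df.write.mode("overwrite").parquet("s3://mybucket/some/path/")

# The script simply ends here after the write completes.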

Kamil Sindi asked Nov 18 '16 20:11


People also ask

What action will an EMR cluster configured for step execution take after running a Hive program?

When you configure termination after step execution, the cluster starts, runs bootstrap actions, and then runs the steps that you specify. As soon as the last step completes, Amazon EMR terminates the cluster's Amazon EC2 instances.
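As a hedged illustration of that behavior (not part of the original question), here is a minimal boto3 sketch that launches a cluster configured for step execution so that it terminates once its last step completes; the names, instance types, roles, and paths are hypothetical placeholders:

import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="step-execution-cluster",
    ReleaseLabel="emr-5.1.0",
    Applications=[{"Name": "Spark"}],
    Instances={
        "MasterInstanceType": "m4.large",
        "SlaveInstanceType": "m4.large",
        "InstanceCount": 3,
        # False = shut the cluster down as soon as the last step finishes.
        "KeepJobFlowAliveWhenNoSteps": False,
    },
    Steps=[
        {
            "Name": "my-spark-step",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "--deploy-mode", "cluster",
                         "s3://mybucket/scripts/job.py"],
            },
        }
    ],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print(response["JobFlowId"])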

How do I start EMR cluster after termination?

You can't restart a terminated cluster, but you can clone a terminated cluster to reuse its configuration for a new cluster. For more information, see Cloning a cluster using the console.

How long does it take to start an EMR cluster?

After about a minute, you should see the cluster status say Starting: Configuring Cluster Software. It may take up to 15 minutes for this step to complete.


1 Answer

If you aren't already, you should explicitly stop your Spark context:

sc.stop()

Also, if you are watching the Spark web UI in a browser, close it, as it can sometimes keep the Spark context alive. I recall seeing this discussed on the Spark dev mailing list, but I can't find the JIRA for it.
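For context, here is a minimal sketch of the fix, assuming a SparkSession-based PySpark 2 job shaped like the one in the question; the bucket and paths are hypothetical:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("emr-step-example").getOrCreate()

df = spark.read.json("s3://mybucket/some/input/")
df.write.mode("overwrite").parquet("s3://mybucket/some/path/")

# spark.stop() stops the underlying SparkContext (the same effect as sc.stop()),
# so YARN can release the application and EMR can mark the step as Completed.
spark.stop()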

J Maurer answered Nov 15 '22 03:11