
Where are the Spark logs on EMR?

I'm not able to locate error logs or messages from println calls in Scala while running jobs on Spark in EMR.

Where can I access these?

I'm submitting the Spark job, written in Scala, to EMR using script-runner.jar with the arguments `--deploy-mode` set to `cluster` and `--master` set to `yarn`. The job runs fine.
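Roughly, the submission looks like this (the cluster ID, class name, and S3 paths are placeholders, not the real values):

    # Run spark-submit through script-runner.jar as an EMR step.
    # j-XXXXXXXXXXXXX, com.example.MyJob, and the s3:// bucket are placeholders.
    aws emr add-steps --cluster-id j-XXXXXXXXXXXXX \
      --steps Type=CUSTOM_JAR,Name=SparkJob,Jar=s3://elasticmapreduce/libs/script-runner/script-runner.jar,Args=[/home/hadoop/spark/bin/spark-submit,--deploy-mode,cluster,--master,yarn,--class,com.example.MyJob,s3://my-bucket/my-job.jar]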

However, I do not see my println statements in the Amazon EMR UI where it lists `stderr`, `stdout`, etc. Furthermore, if my job fails, I don't see why it failed. All I see is this in the `stderr`:

15/05/27 20:24:44 INFO yarn.Client: Application report from ResourceManager: 
 application identifier: application_1432754139536_0002
 appId: 2
 clientToAMToken: null
 appDiagnostics: 
 appMasterHost: ip-10-185-87-217.ec2.internal
 appQueue: default
 appMasterRpcPort: 0
 appStartTime: 1432758272973
 yarnAppState: FINISHED
 distributedFinalState: FAILED
 appTrackingUrl: http://10.150.67.62:9046/proxy/application_1432754139536_0002/A
 appUser: hadoop


Sean Bollin asked May 27 '15 23:05


1 Answer

With a deploy mode of `cluster` on YARN, the Spark driver, and hence the user code it executes, runs inside the Application Master container, so your println output ends up in that container's logs rather than in the step's `stderr`/`stdout`. It sounds like you had EMR debugging enabled on the cluster, so the logs should also have been pushed to S3. In the S3 log location, look at `task-attempts/<applicationid>/<firstcontainer>/*`.
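For example, assuming a cluster log URI of `s3://my-log-bucket/logs/` (a placeholder, substitute your own bucket, prefix, and cluster ID), you could browse and pull those logs with the AWS CLI:

    # List the per-container log directories for the failed application
    aws s3 ls s3://my-log-bucket/logs/j-XXXXXXXXXXXXX/task-attempts/application_1432754139536_0002/

    # Stream the first (Application Master) container's stdout, where the
    # driver's println output lands; EMR gzips logs before pushing them to S3
    aws s3 cp s3://my-log-bucket/logs/j-XXXXXXXXXXXXX/task-attempts/application_1432754139536_0002/container_1432754139536_0002_01_000001/stdout.gz - | gunzip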

ChristopherB answered Sep 23 '22 22:09