I'm not able to locate error logs or messages from `println` calls in Scala while running jobs on Spark in EMR. Where can I access these?
I'm submitting the Spark job, written in Scala, to EMR using `script-runner.jar` with the arguments `--deploy-mode` set to `cluster` and `--master` set to `yarn`. The job itself runs fine.
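For context, the step I submit looks roughly like the following; the cluster ID, bucket, and class name here are placeholders, not my real values:

```sh
# Hypothetical step submission; cluster ID, S3 paths, and class name are placeholders.
aws emr add-steps --cluster-id j-XXXXXXXXXXXXX --steps \
  Type=CUSTOM_JAR,Name=SparkJob,ActionOnFailure=CONTINUE,\
Jar=s3://elasticmapreduce/libs/script-runner/script-runner.jar,\
Args=[/home/hadoop/spark/bin/spark-submit,--deploy-mode,cluster,--master,yarn,--class,com.example.MyJob,s3://my-bucket/jars/my-job.jar]
```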
However, I do not see my `println` statements in the Amazon EMR UI where it lists "stderr", "stdout", etc. Furthermore, if my job errors, I can't see why it failed. All I see is this in `stderr`:

```
15/05/27 20:24:44 INFO yarn.Client: Application report from ResourceManager:
application identifier: application_1432754139536_0002
appId: 2
clientToAMToken: null
appDiagnostics:
appMasterHost: ip-10-185-87-217.ec2.internal
appQueue: default
appMasterRpcPort: 0
appStartTime: 1432758272973
yarnAppState: FINISHED
distributedFinalState: FAILED
appTrackingUrl: http://10.150.67.62:9046/proxy/application_1432754139536_0002/A
appUser: hadoop
```
With a deploy mode of `cluster` on YARN, the Spark driver, and hence the user code, executes inside the Application Master container, so its stdout and stderr end up in that container's logs rather than in the step's logs. It sounds like you had EMR debugging enabled on the cluster, so the logs should also have been pushed to S3. In the S3 log location, look under `task-attempts/<applicationid>/<firstcontainer>/*`.
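If YARN log aggregation is enabled, you can also pull everything from the master node with the YARN CLI, or list the S3 copies directly. A sketch, assuming a hypothetical log bucket and cluster ID:

```sh
# From the EMR master node: dump all container logs for the application,
# including the Application Master's stdout, where println output lands.
yarn logs -applicationId application_1432754139536_0002

# Or list the copies that EMR debugging pushed to S3
# (bucket name and cluster ID are placeholders):
aws s3 ls --recursive \
  s3://my-emr-log-bucket/j-XXXXXXXXXXXXX/task-attempts/application_1432754139536_0002/
```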