How can I get the executor ID when running PySpark code? I know that in Scala I can use SparkEnv.get().executorId(), but I cannot find the equivalent in PySpark.
You can use the REST API to query the executors; I have used it in pySparkUtils to find the executor IPs. A minimal sketch of that approach is below.
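For example, Spark's monitoring REST API exposes an executors endpoint per application. The sketch below assumes the driver UI is reachable at localhost:4040 (the default); adjust the host and port for your cluster.

```python
# Minimal sketch: query the Spark REST API for executor IDs and addresses.
# Assumes the driver UI is at http://localhost:4040 (the default port).
import requests

base_url = "http://localhost:4040/api/v1"

# Find the running application's ID first.
app_id = requests.get(f"{base_url}/applications").json()[0]["id"]

# Each entry describes one executor (the driver appears as well) and
# includes its "id" and "hostPort" fields.
executors = requests.get(f"{base_url}/applications/{app_id}/executors").json()

for e in executors:
    print(e["id"], e["hostPort"])
```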
Boaz