I have a standalone Spark cluster running on a remote server, and I'm new to Spark. It appears that there is no authentication scheme protecting the cluster master's port (7077) by default: anyone can simply submit their own code to the cluster without any restrictions.
The Spark documentation states that authentication is possible in standalone deploy mode using the spark.authenticate.secret parameter, but doesn't really elaborate on how exactly it should be used.
Is it possible to use some sort of shared secret that would prevent any potential attacker from submitting tasks to the cluster? Can anyone explain how exactly that can be configured?
There are two parts to enabling authentication:
On each server in your cluster, add the following settings to conf/spark-defaults.conf, then restart the master and workers so they pick up the change (spark.authenticate defaults to false, so setting the secret alone is not enough):

spark.authenticate true
spark.authenticate.secret SomeSecretKey
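SomeSecretKey above is just a placeholder; since anyone who knows the secret can submit jobs, you would normally generate a long random value instead, for example with openssl:

openssl rand -base64 32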
When you initialize the SparkContext in your driver application, set the same secret in its configuration, e.g.:
val conf = new SparkConf()
  .set("spark.authenticate", "true")
  .set("spark.authenticate.secret", "SomeSecretKey") // must match the secret on the cluster
val sc = new SparkContext(conf)
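Hardcoding the secret in source code means it ends up in version control; a safer pattern is to read it from the environment at startup. A minimal sketch, assuming a hypothetical SPARK_AUTH_SECRET environment variable set on the driver machine:

import org.apache.spark.{SparkConf, SparkContext}

// SPARK_AUTH_SECRET is a hypothetical variable name, not a Spark convention;
// fail fast if it is missing rather than starting an unauthenticated context.
val secret = sys.env.getOrElse("SPARK_AUTH_SECRET",
  sys.error("SPARK_AUTH_SECRET is not set"))

val conf = new SparkConf()
  .set("spark.authenticate", "true")
  .set("spark.authenticate.secret", secret)
val sc = new SparkContext(conf)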
Or, if you are using SparkSession:

val spark = SparkSession.builder()
  .config("spark.authenticate", "true")
  .config("spark.authenticate.secret", "SomeSecretKey")
  .getOrCreate()
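The same properties also work when submitting a packaged application with spark-submit; the master host and jar name below are placeholders:

spark-submit \
  --master spark://master-host:7077 \
  --conf spark.authenticate=true \
  --conf spark.authenticate.secret=SomeSecretKey \
  your-app.jar

spark-submit also reads the client machine's conf/spark-defaults.conf, so if the secret is configured there you don't have to repeat it on every invocation.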