 

Spark Scala JAAS configuration

I'm executing Spark code in the Scala shell using the Kafka jars, and my intention is to stream messages from a Kafka topic. My Spark object is created, but can anyone help me with how to pass a JAAS configuration file while starting the spark shell? The error points to a missing JAAS configuration.

asked Mar 06 '23 by RData

1 Answer

Assuming you have a spark-kafka.jaas file in the current folder you are running spark-submit (or spark-shell) from, you pass it with --files and also point both the driver and executor JVMs at it:

spark-submit \
 ...
  --files "spark-kafka.jaas#spark-kafka.jaas" \
  --driver-java-options "-Djava.security.auth.login.config=./spark-kafka.jaas" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./spark-kafka.jaas"

You might also need to set "security.protocol" in the Kafka properties inside your Spark code to one of the supported Kafka SASL protocols (for example SASL_PLAINTEXT or SASL_SSL), matching the broker listener you connect to.
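As a sketch, assuming the Structured Streaming Kafka source (spark-sql-kafka), the Kafka client properties, including security.protocol, are passed through with a kafka. prefix; the broker addresses, topic name, and SASL mechanism here are placeholders:

import org.apache.spark.sql.SparkSession

// In spark-shell the session already exists as `spark`; the builder is shown
// only to keep the sketch self-contained.
val spark = SparkSession.builder().appName("kafka-sasl-example").getOrCreate()

val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker1:9093")  // placeholder broker list
  .option("subscribe", "my-topic")                    // placeholder topic
  .option("kafka.security.protocol", "SASL_SSL")      // must match the broker listener
  .option("kafka.sasl.mechanism", "PLAIN")            // must match the JAAS login module
  .load()

// The value column is binary; cast it to a string to inspect the messages.
val messages = df.selectExpr("CAST(value AS STRING)")

messages.writeStream
  .format("console")
  .start()

The JAAS file itself is picked up through the java.security.auth.login.config system property set above, so nothing extra is needed in the code beyond the protocol and mechanism options.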

answered Mar 10 '23 by OneCricketeer