We have an existing Dataproc estate and control access using Dataproc's predefined roles. We would like to limit the permissions our users have across our GCP projects, so we are replacing the predefined roles with custom roles.
I have designed a custom role to govern access to dataproc and given it the following permissions:
When a user that has been granted this role submits a job using:
gcloud dataproc jobs submit spark \
--id $DATAPROC_JOB_ID --async \
--project $GCP_PROJECT --region europe-west1 \
--cluster $clusterName \
--class org.apache.spark.examples.SparkPi \
--jars file:///usr/lib/spark/examples/jars/spark-examples.jar \
-- 1000
It fails with error:
ERROR: (gcloud.dataproc.jobs.submit.spark) PERMISSION_DENIED: Not authorized to requested resource.
I'm wondering which permissions might be missing, because the ones I've granted seem to cover everything required to submit a job. Am I missing some permissions?
Figured it out. It needed dataproc.clusters.use too.
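For anyone hitting the same error, the missing permission can be added to an existing custom role in place. A minimal sketch, assuming a project-level custom role whose ID is dataprocSubmitter (the role ID here is hypothetical):

gcloud iam roles update dataprocSubmitter \
  --project $GCP_PROJECT \
  --add-permissions dataproc.clusters.use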
Yes, you are correct.
Submitting a Dataproc job requires both the dataproc.clusters.use and dataproc.jobs.create permissions.
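If you are building the custom role from scratch, a role carrying just enough to submit and monitor jobs could look like the sketch below; the role ID, title, and the extra dataproc.jobs.get permission are illustrative assumptions, not Dataproc requirements:

gcloud iam roles create dataprocJobSubmitter \
  --project $GCP_PROJECT \
  --title "Dataproc Job Submitter" \
  --permissions dataproc.clusters.use,dataproc.jobs.create,dataproc.jobs.get \
  --stage GA

You can confirm which permissions a role actually contains with gcloud iam roles describe ROLE_ID --project $GCP_PROJECT.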