dataproc job submission failing with 'Not authorized to requested resource', what permission is missing?

We have an existing Dataproc estate and we control access using Dataproc's predefined roles. We would like to limit the permissions our userbase has across our GCP projects, hence we are replacing the predefined roles with custom roles.

I have designed a custom role to govern access to dataproc and given it the following permissions:

  • dataproc.clusters.delete
  • dataproc.clusters.get
  • dataproc.clusters.list
  • dataproc.jobs.cancel
  • dataproc.jobs.create
  • dataproc.jobs.delete
  • dataproc.jobs.get
  • resourcemanager.projects.get


When a user that has been granted this role submits a job using:

gcloud dataproc jobs submit spark \
  --id $DATAPROC_JOB_ID --async \
  --project $GCP_PROJECT --region europe-west1 \
  --cluster $clusterName \
  --class org.apache.spark.examples.SparkPi \
  --jars file:///usr/lib/spark/examples/jars/spark-examples.jar \
  -- 1000

It fails with error:

ERROR: (gcloud.dataproc.jobs.submit.spark) PERMISSION_DENIED: Not authorized to requested resource.

I'm wondering what permission it might be missing, because the permissions I've granted seem to cover everything required to submit a job. Am I missing something?

asked Jan 25 '23 by jamiet

2 Answers

Figured it out. It needed dataproc.clusters.use too.
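For anyone fixing this the same way: the missing permission can be added to the existing custom role in place rather than recreating it. A minimal sketch using gcloud, where the role ID `dataprocUser` is a placeholder for whatever your custom role is called:

```shell
# Add the missing dataproc.clusters.use permission to an existing
# project-level custom role. "dataprocUser" and $GCP_PROJECT are
# placeholders for your own role ID and project.
gcloud iam roles update dataprocUser \
  --project $GCP_PROJECT \
  --add-permissions dataproc.clusters.use
```

Note that IAM changes can take a minute or two to propagate before job submission starts succeeding.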

answered Apr 07 '23 by jamiet


Yes, you are correct.

To submit a Dataproc job, both the 'dataproc.clusters.use' and 'dataproc.jobs.create' permissions are required.
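You can confirm that a custom role carries both permissions by inspecting it. A sketch, again assuming a hypothetical project-level role ID of `dataprocUser`:

```shell
# List the permissions included in the custom role and check that
# both dataproc.clusters.use and dataproc.jobs.create appear.
# "dataprocUser" and $GCP_PROJECT are placeholders.
gcloud iam roles describe dataprocUser \
  --project $GCP_PROJECT \
  --format="value(includedPermissions)"
```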

answered Apr 07 '23 by Zhang Bo