If I issue gcloud dataproc clusters list, 0 clusters are listed:
$ gcloud dataproc clusters list
Listed 0 items.
However, if I specify the region (gcloud dataproc clusters list --region europe-west1), I get back a list of clusters:
$ gcloud dataproc clusters list --region europe-west1
NAME WORKER_COUNT STATUS ZONE
mydataproccluster1 2 RUNNING europe-west1-d
mydataproccluster2 2 RUNNING europe-west1-d
I'm guessing that the inability to get a list of clusters without specifying --region is a consequence of a decision made by my org's administrators; however, I'm hoping there is a way around it. I can visit https://console.cloud.google.com/ and see a list of all the clusters in the project. Can I get the same using gcloud? Having to visit https://console.cloud.google.com/ just so I can issue gcloud dataproc clusters list --region europe-west1 seems a bit of a limitation.
The underlying regional services are isolated from each other by design, so there is no single URL that returns the combined list (that would introduce a global dependency and a global failure mode). Unfortunately, at the moment the layout of the gcloud libraries is such that there's no option for specifying a list of regions, or a shorthand for "all regions", when listing Dataproc clusters or jobs.
However, you can work around this by obtaining the list of possible regional stacks from the Compute API:
gcloud compute regions list --format="value(name)" | \
xargs -n 1 gcloud dataproc clusters list --region
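If you only need machine-readable output, for example just the cluster names for scripting, the same pattern works with a value projection. This is a sketch: clusterName is the field name on the Dataproc API's Cluster resource, so adjust it if your gcloud version exposes a different field:
gcloud compute regions list --format="value(name)" | \
    xargs -n 1 gcloud dataproc clusters list --format="value(clusterName)" --region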
The only dataproc region that doesn't match up to one of the Compute regions is the special "global" Dataproc region, which is a separate Dataproc service that spans all compute regions.
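Clusters in that special region are listed the same way, by passing global as the region name:
gcloud dataproc clusters list --region global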
For convenience you can also just add global to a for-loop:
for REGION in global $(gcloud compute regions list --format="value(name)"); do
  gcloud dataproc clusters list --region "${REGION}"
done
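If the "Listed 0 items." message printed for every empty region is too noisy, you can discard it. This sketch assumes those status lines go to stderr, which gcloud typically does; the redirect also hides genuine errors (for example, a permissions problem in one region), so drop it when debugging:
for REGION in global $(gcloud compute regions list --format="value(name)"); do
  # Discard per-region status/progress output, keep the cluster table on stdout
  gcloud dataproc clusters list --region "${REGION}" 2>/dev/null
done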