I am working on a Bitbucket pipeline for pushing an image to Google Container Registry. I have created a service account with the Storage Admin role. ([email protected])
gcloud auth activate-service-account --key-file key.json
gcloud config set project mgcp-xxxx
gcloud auth configure-docker --quiet
docker push eu.gcr.io/mgcp-xxxx/image-name
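For context, those commands run in a pipeline step roughly like the sketch below (the base image and the GCLOUD_API_KEYFILE repository variable name are placeholders, not my exact setup):

# bitbucket-pipelines.yml (sketch of the step layout)
image: google/cloud-sdk:latest        # any image with gcloud and the docker client available
pipelines:
  default:
    - step:
        name: Build and push to GCR
        services:
          - docker                    # enables the Docker daemon in Bitbucket Pipelines
        script:
          # GCLOUD_API_KEYFILE is a placeholder repository variable holding the base64-encoded key.json
          - echo $GCLOUD_API_KEYFILE | base64 -d > key.json
          - gcloud auth activate-service-account --key-file key.json
          - gcloud config set project mgcp-xxxx
          - gcloud auth configure-docker --quiet
          - docker build -t eu.gcr.io/mgcp-xxxx/image-name .
          - docker push eu.gcr.io/mgcp-xxxx/image-name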
Although the login is successful, I get: Token exchange failed for project 'mgcp-xxxx'. Caller does not have permission 'storage.buckets.get'. To configure permissions, follow instructions at: https://cloud.google.com/container-registry/docs/access-control
Can anyone advise on what I am missing?
Thanks!
Tag the image with the registry name. This configures the docker push command to push the image to a specific location. The registry name format is gcr.io/[PROJECT-ID]/[IMAGE], where [PROJECT-ID] is your Google Cloud Platform Console project ID and [IMAGE] is your image's name. You are now ready to push your image to GCR!
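As a quick sketch with placeholder project and image names:

# Tag a local image with the registry name, then push it
docker tag my-local-image gcr.io/my-project/my-image
docker push gcr.io/my-project/my-image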
Google Container Registry is a private Docker registry running on Google Cloud Storage. It uses the same authentication, storage, and billing as google/docker-registry, without the need to run your own registry.
To push any local image to Container Registry, you need to first tag it with the registry name and then push the image. The very first image that you push to a multi-regional host will create the storage bucket for that hostname in your Google Cloud project.
Google Container Registry is a private storage service for Docker images, used to host images for deployment on other GCP container services, like Cloud Run and Kubernetes Engine.
When you push an image to a registry that does not exist yet in your project, Container Registry creates a storage bucket. To view the image you pushed, go to the Cloud Console to view the registry and image, or run gcloud container images list-tags to view the image tag and the automatically-generated digest:
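For example, using the image from the question:

gcloud container images list-tags eu.gcr.io/mgcp-xxxx/image-name
# lists the digest, tags and timestamp for each pushed version of the image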
Google Cloud services that typically access Container Registry are configured with default permissions to registries in the same Google Cloud project. If the default permissions don't meet your needs, you must configure the appropriate permissions.
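If you do need to grant a role yourself, a minimal sketch with gcloud (the service account email here is a placeholder):

# Grant the pushing service account the Storage Admin role on the project
gcloud projects add-iam-policy-binding mgcp-xxxx \
  --member="serviceAccount:my-sa@mgcp-xxxx.iam.gserviceaccount.com" \
  --role="roles/storage.admin"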
For anyone reading all the way here: the other suggestions here did not help me, however I found that the Cloud Build Service Account role was also required. Then the storage.buckets.get error disappears.
This is my minimal two-role setup to push Docker images:
The Cloud Build Service Account role, however, adds many more permissions than simply storage.buckets.get. The exact permissions can be found here.
Note: I am well aware the Cloud Build Service Account role also adds the storage.objects.get permission. However, adding roles/storage.objectViewer did not resolve my problem, regardless of the fact that it has the storage.objects.get permission.
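For reference, this is roughly how that role can be granted to the pipeline's service account (the service account email is a placeholder, and roles/cloudbuild.builds.builder is my assumption for the Cloud Build Service Account role ID):

gcloud projects add-iam-policy-binding mgcp-xxxx \
  --member="serviceAccount:my-sa@mgcp-xxxx.iam.gserviceaccount.com" \
  --role="roles/cloudbuild.builds.builder"   # assumed ID of the Cloud Build Service Account role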
If the above does not work, you might have the wrong account active. This can be resolved with:
gcloud auth activate-service-account --key-file key.json
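You can check which account is currently active with:

gcloud auth list
# the active account is marked with an asterisk (*)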
If that does not work, you might need to set the Docker credential helpers with:
gcloud auth configure-docker --project <project_name>
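After that, ~/.docker/config.json should map the GCR hosts to the gcloud credential helper; a quick way to check (the exact host list may differ):

cat ~/.docker/config.json
# expect something like: { "credHelpers": { "gcr.io": "gcloud", "eu.gcr.io": "gcloud", ... } }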
On one final note: there seemed to be some delay between setting a role and it taking effect via the gcloud tool. This was, however, minimal; think less than a minute.
Cheers