 

Can't access Google Cloud Datastore from Google Kubernetes Engine cluster

I have a simple application that Gets and Puts data in Cloud Datastore.

It works everywhere, but when I run it from inside the Kubernetes Engine cluster, I get this output:

Error from Get()
rpc error: code = PermissionDenied desc = Request had insufficient authentication scopes.
Error from Put()
rpc error: code = PermissionDenied desc = Request had insufficient authentication scopes.

I'm using the cloud.google.com/go/datastore package and the Go language.

I don't know why I'm getting this error since the application works everywhere else just fine.
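
Roughly, the code in question looks like this (a minimal sketch with a made-up Task kind and project ID, not the exact application):

  package main

  import (
      "context"
      "fmt"
      "log"

      "cloud.google.com/go/datastore"
  )

  // Task is a placeholder entity kind used only for this sketch.
  type Task struct {
      Description string
  }

  func main() {
      ctx := context.Background()

      // NewClient uses Application Default Credentials (for example the
      // GCE/GKE metadata server or GOOGLE_APPLICATION_CREDENTIALS).
      client, err := datastore.NewClient(ctx, "my-project-id")
      if err != nil {
          log.Fatalf("datastore.NewClient: %v", err)
      }
      defer client.Close()

      key := datastore.NameKey("Task", "sample-task", nil)

      var t Task
      if err := client.Get(ctx, key, &t); err != nil {
          fmt.Println("Error from Get()")
          fmt.Println(err)
      }

      if _, err := client.Put(ctx, key, &Task{Description: "hello"}); err != nil {
          fmt.Println("Error from Put()")
          fmt.Println(err)
      }
  }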

Update:

Looking for an answer I found this comment on Google Groups:

In order to use Cloud Datastore from GCE, the instance needs to be configured with a couple of extra scopes. These can't be added to existing GCE instances, but you can create a new one with the following Cloud SDK command:

gcloud compute instances create hello-datastore --project --zone --scopes datastore userinfo-email

Would that mean I can't use Datastore from GKE by default?

Update 2:

I can see that when creating my cluster I didn't enable any permissions (they are disabled for most services by default). I suppose that's what's causing the issue.

Strangely, I can use CloudSQL just fine even though it's disabled (using the cloudsql_proxy container).

asked Dec 02 '17 by Daniel Chmielewski


1 Answer

Here is what I learned while debugging this issue:

  • During the creation of a Kubernetes Engine cluster you can specify access scopes (permissions) for the GCE nodes that will be created.

  • If, for example, you enable Datastore access on the cluster nodes during creation, you will be able to access Datastore directly from your Pods without any further setup.

  • If your cluster node permissions are disabled for most things (default settings) like mine were, you will need to create an appropriate Service Account for each application that wants to use a GCP resource like Datastore.

  • Another alternative is to create a new node pool with the gcloud command, set the desired access scopes on it, and then migrate all deployments to the new node pool (rather tedious); a rough sketch of those commands follows this list.
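
A rough sketch of the scope-related gcloud commands (cluster, pool and zone names are made up; check the flags against your gcloud version):

  # Create a cluster whose nodes get the Datastore scope up front:
  gcloud container clusters create my-cluster \
      --zone us-central1-a \
      --scopes datastore,userinfo-email

  # Or add a node pool with those scopes to an existing cluster and
  # migrate workloads onto it:
  gcloud container node-pools create datastore-pool \
      --cluster my-cluster \
      --zone us-central1-a \
      --scopes datastore,userinfo-email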

In the end, I fixed the issue by creating a Service Account for my application, downloading its JSON authentication key, and creating a Kubernetes secret that contains the key. For Datastore specifically, I then set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of the mounted secret's JSON key.

This way, when my application starts, it checks whether the GOOGLE_APPLICATION_CREDENTIALS variable is present and authenticates its Datastore API access with the JSON key the variable points to.
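
The individual steps look roughly like this (all names are hypothetical; the secret name and key file name only have to match the Deployment below):

  # Create the Service Account and grant it Datastore access:
  gcloud iam service-accounts create foo-app
  gcloud projects add-iam-policy-binding my-project-id \
      --member serviceAccount:foo-app@my-project-id.iam.gserviceaccount.com \
      --role roles/datastore.user

  # Download a JSON key for it:
  gcloud iam service-accounts keys create credentials.json \
      --iam-account foo-app@my-project-id.iam.gserviceaccount.com

  # Store the key in the Kubernetes secret mounted by the Deployment:
  kubectl create secret generic foo-service-account \
      --from-file=credentials.json=credentials.json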

Deployment YAML snippet:

  ...
  containers:
  - image: foo
    name: foo
    env:
    # Point the Google client libraries at the mounted service account key.
    - name: GOOGLE_APPLICATION_CREDENTIALS
      value: /auth/credentials.json
    volumeMounts:
    # Mount the secret (containing credentials.json) at /auth.
    - name: foo-service-account
      mountPath: "/auth"
      readOnly: true
  volumes:
  # Secret created from the service account's JSON key.
  - name: foo-service-account
    secret:
      secretName: foo-service-account
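
On the Go side, that startup check might look roughly like this (a sketch with a hypothetical helper; note that datastore.NewClient already picks up GOOGLE_APPLICATION_CREDENTIALS on its own via Application Default Credentials, so the explicit branch is optional):

  package fooapp

  import (
      "context"
      "os"

      "cloud.google.com/go/datastore"
      "google.golang.org/api/option"
  )

  // newDatastoreClient is a hypothetical helper: if GOOGLE_APPLICATION_CREDENTIALS
  // is set (e.g. to the mounted /auth/credentials.json), load that key explicitly;
  // otherwise fall back to the default credential lookup.
  func newDatastoreClient(ctx context.Context, projectID string) (*datastore.Client, error) {
      if path := os.Getenv("GOOGLE_APPLICATION_CREDENTIALS"); path != "" {
          return datastore.NewClient(ctx, projectID, option.WithCredentialsFile(path))
      }
      return datastore.NewClient(ctx, projectID)
  }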

answered Sep 29 '22 by Daniel Chmielewski