Use Google Cloud credentials inside ephemeral container?

We use Docker containers for most of our work, including development on our own machines. These are ephemeral (started each time we run a test, for example).

For AWS, the auth is easy - we have our keys in our environment, and those are passed through to the container.
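Concretely, the AWS pass-through looks roughly like this (the image name my-image is a placeholder; docker run -e VAR with no value forwards that variable from the host environment):

# Forward the host's AWS credentials into the ephemeral container
docker run --rm \
  -e AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY \
  -e AWS_DEFAULT_REGION \
  my-image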

We're starting to use Google Cloud services, and the auth path seems harder than AWS. When doing local development, gcloud auth login works well. But when working in an ephemeral container, the login process would be needed each time, and I haven't found a way of persisting user credentials using either a) environment variables or b) mapping volumes - which are the two ways of passing data to containers.

From what I can read, the only path is to use service accounts. But then I think everyone needs their own service account, and has to keep that account's permissions aligned with their own user's permissions.
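For reference, the service-account path we've been reading about seems to look roughly like this (the service account, project, and key path are placeholders, not anything from our setup):

# One-time: create a key for an existing service account
gcloud iam service-accounts keys create key.json \
  --iam-account=my-sa@my-project.iam.gserviceaccount.com

# Mount the key into the container and point the client libraries at it
docker run --rm \
  -v $PWD/key.json:/tmp/key.json \
  -e GOOGLE_APPLICATION_CREDENTIALS=/tmp/key.json \
  my-image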

Is there a better way?

asked Feb 17 '17 by Maximilian

People also ask

Where are GCP credentials stored?

When gsutil is installed/used via the Cloud SDK ("gcloud"), credentials are stored by the Cloud SDK in a non-user-editable file located under ~/.config/gcloud (any manipulation of credentials should be done via the gcloud auth command).

What is ephemeral storage in GCP?

Ephemeral storage types (for example, emptyDir) do not persist after the Pod ceases to exist. These types are useful as scratch space for applications. You can manage your local ephemeral storage resources as you do your CPU and memory resources. Other volume types, such as persistent volumes, are backed by durable storage.


2 Answers

The easiest way to make a local container see the gcloud credentials might be to map the file-system location of the application default credentials into the container.

First, do

gcloud auth application-default login

Then, run your container as

docker run -ti -v=$HOME/.config/gcloud:/root/.config/gcloud test

This should work. I tried it with a Dockerfile like

FROM node:4
RUN npm install --save @google-cloud/storage
ADD test.js .
CMD node ./test.js

and the test.js file like

var storage = require('@google-cloud/storage');

// Create a client; with the credentials directory mounted into the container,
// the library picks up the application default credentials automatically.
var gcs = storage({
    projectId: 'my-project-515',
});

// List the files in a bucket and print their names.
var bucket = gcs.bucket('my-bucket');
bucket.getFiles(function(err, files) {
  if (err) {
    console.log("failed to get files: ", err)
  } else {
    for (var i in files) {
      console.log("file: ", files[i].name)
    }
  }
})

and it worked as expected.
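For completeness, the build-and-run sequence was roughly this (the image tag test matches the docker run command above):

docker build -t test .
docker run -ti -v=$HOME/.config/gcloud:/root/.config/gcloud test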

answered Oct 19 '22 by Alexey Alexandrov

I had the same issue, but I was using docker-compose. It was solved by adding the following to docker-compose.yml:

    volumes:
      - $HOME/.config/gcloud:/root/.config/gcloud
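For context, a minimal docker-compose.yml around that mapping would look roughly like this (the service name app and the image name my-image are placeholders):

version: "3"
services:
  app:
    image: my-image
    volumes:
      # Share the host's gcloud application default credentials with the container
      - $HOME/.config/gcloud:/root/.config/gcloud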
answered Oct 19 '22 by Vojtěch