 

Unable to connect to Google Container Engine

I've updated gcloud to the latest version (159.0.0)

I created a Google Container Engine node, and then followed the instructions in the prompt.

gcloud container clusters get-credentials prod --zone us-west1-b --project myproject

Fetching cluster endpoint and auth data.
kubeconfig entry generated for prod

kubectl proxy
Unable to connect to the server: error executing access token command 
"/Users/me/Code/google-cloud-sdk/bin/gcloud ": exit status 

Any idea why it is not able to connect?

asked Jun 20 '17 by skunkwerk

People also ask

Can't connect over SSH to Google Cloud?

The firewall rule allowing SSH is missing or misconfigured. By default, Compute Engine VMs allow SSH access on port 22. If the default-allow-ssh rule is missing or misconfigured, you won't be able to connect to VMs. To resolve this issue, check your firewall rules and re-add or reconfigure default-allow-ssh.
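For reference, a minimal sketch of how to check and re-create that rule with gcloud (the rule name default-allow-ssh and the wide-open 0.0.0.0/0 source range are assumptions; tighten the range for your own project):

gcloud compute firewall-rules list --filter="name=default-allow-ssh"
gcloud compute firewall-rules create default-allow-ssh --allow tcp:22 --source-ranges 0.0.0.0/0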

Could not SSH into the instance?

Could not SSH to the instance. It is possible that your SSH key has not propagated to the instance yet. Try running the command again. If you still cannot connect, verify that the firewall and instance are set to accept SSH traffic.
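If the key simply hasn't propagated yet, retrying a minute later is often enough; a sketch, assuming an instance named my-instance in us-west1-b:

gcloud compute ssh my-instance --zone us-west1-b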

What is the Google Container Engine?

GKE provides a managed environment for deploying, managing, and scaling your containerized applications using Google infrastructure. The GKE environment consists of multiple machines (specifically, Compute Engine instances) grouped together to form a cluster.
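To see the clusters in a project and the Compute Engine instances backing them, something like the following should work (cluster name prod and the zone are taken from the question above):

gcloud container clusters list
gcloud container clusters describe prod --zone us-west1-b
gcloud compute instances list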


2 Answers

On GKE, updating the credentials from the "Kubernetes Engine / Clusters" page worked for me.

The cluster's row provides a "Connect" button that copies the credentials command to the console. Running it refreshes the token in use, and then kubectl works again. Why did my token expire? Well, I suppose GCP tokens are not eternal.

So the button just runs the same command for you: gcloud container clusters get-credentials your-cluster ...
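In other words, pressing "Connect" is equivalent to re-running the command from the question and then retrying kubectl; a sketch using the names from the question:

gcloud container clusters get-credentials prod --zone us-west1-b --project myproject
kubectl get nodes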

Bruno

answered Oct 17 '22 by bruno777


You can run the following to check whether the config was generated correctly:

kubectl config view 
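A couple of related checks that can help confirm kubectl is pointing at the new cluster (standard kubectl subcommands, not specific to this issue):

kubectl config current-context
kubectl config get-contexts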

I had a similar issue when trying to run kubectl commands on a new Kubernetes cluster just created on Google Cloud Platform.

The solution for my case was to activate Google Application Default Credentials.

You can find a link below on how to activate it.

Basically, you need to set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of the .json credentials file exported from Google Cloud, e.g. c:\...\..\..Credentials.json. See https://developers.google.com/identity/protocols/application-default-credentials
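For example (a hedged sketch; the path is a placeholder for the file you exported, and on Linux/macOS you would use export instead of set):

set GOOGLE_APPLICATION_CREDENTIALS=C:\path\to\Credentials.json

Alternatively, gcloud auth application-default login can generate application-default credentials for your own user account.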

I found this solution on a Kubernetes GitHub issue: https://github.com/kubernetes/kubernetes/issues/30617

PS: make sure you have also set these environment variables: %HOME% to %USERPROFILE%, and %KUBECONFIG% to %USERPROFILE%.
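On Windows that could look like the following (a sketch; the PS above points KUBECONFIG at %USERPROFILE%, while kubectl's default config file lives at %USERPROFILE%\.kube\config, so adjust to wherever your kubeconfig actually is):

setx HOME %USERPROFILE%
setx KUBECONFIG %USERPROFILE%\.kube\config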

answered Oct 17 '22 by Alexandru Balan