We have a user who is allowed to SSH into a VM on the Google Cloud Platform.
His key is added to the VM and he can SSH using
gcloud compute ssh name-of-vm
However, connecting this way always makes gcloud try to update the project-wide metadata:
Updating project ssh metadata...failed
It fails because he only has rights to access and administer this VM.
However, it's very annoying that every time he connects this way he has to wait for GCP to attempt the metadata update (which is not allowed) and then check the SSH keys on the machine.
Yes, we could 'block project-wide SSH keys' on the instance, but that would mean other project admins can no longer log in.
I've also tried to minimise this user's access.
Log in to the Google Cloud Console and select your project. Navigate to the “Compute Engine -> VM Instances” page and select the server you wish to connect to. Click the “Edit” link in the top control bar. On the resulting page, copy and paste your public SSH key into the “SSH Keys” field.
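If you prefer the CLI, the same instance-level key can be added with gcloud. This is only a sketch: the instance name, zone, and keys.txt file are placeholders, and keys.txt is assumed to contain one "USERNAME:PUBLIC_KEY" entry per line.

# Add instance-level SSH keys from a local file.
# Note: this replaces the instance's existing ssh-keys metadata value,
# so keys.txt should contain all keys you want to keep.
gcloud compute instances add-metadata name-of-vm \
    --zone=europe-west1-b \
    --metadata-from-file ssh-keys=keys.txt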
To connect to an instance without an external IP address, use the gcloud compute ssh command with the --internal-ip flag. In the Google Cloud console, go to the VM Instances page and find the internal IP address for the instance that you want to connect to.
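For example (a sketch; the instance name and zone are placeholders, and your workstation must be able to reach the internal IP, e.g. over VPN):

gcloud compute ssh name-of-vm --zone=europe-west1-b --internal-ip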
What you can do is enable OS Login for all the users you need, including admins. Enabling OS Login on instances disables metadata-based SSH key configuration on those instances.
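OS Login can be turned on per instance or for the whole project via metadata. A sketch, with the instance name and zone as placeholders:

# Enable OS Login on a single instance
gcloud compute instances add-metadata name-of-vm \
    --zone=europe-west1-b \
    --metadata enable-oslogin=TRUE

# Or enable it project-wide
gcloud compute project-info add-metadata \
    --metadata enable-oslogin=TRUE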
The role to start, stop, and connect via SSH to an instance would be roles/compute.instanceAdmin (take into account that this role is currently in beta). You can check the list of available Compute Engine roles here so you can choose the one that best suits your needs.
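Granting that role could look like this (a sketch; the project ID and user email are placeholders):

# Grant the instance admin role at the project level
gcloud projects add-iam-policy-binding my-project-id \
    --member="user:jane@example.com" \
    --role="roles/compute.instanceAdmin"
# With OS Login enabled, the user may also need roles/compute.osLogin
# (or roles/compute.osAdminLogin for sudo access).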
To store data in a bucket, I think the most suitable role is roles/storage.objectCreator, which allows users to create objects but not to delete or overwrite them.
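Granting that role on a specific bucket might look like this (a sketch; the bucket name and user email are placeholders):

# Allow the user to create objects in one bucket only
gsutil iam ch user:jane@example.com:objectCreator gs://my-upload-bucket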