I have a Terraform script that creates instances in Google Cloud Platform. I want the script to also add my SSH key to the instances it creates, so that I can provision them over SSH. Here is my current Terraform script:
```hcl
# PROVIDER INFO
provider "google" {
  credentials = "${file("account.json")}"
  project     = "myProject"
  region      = "us-central1"
}

# MAKING CONSUL SERVERS
resource "google_compute_instance" "default" {
  count        = 3
  name         = "a-consul${count.index}"
  machine_type = "n1-standard-1"
  zone         = "us-central1-a"

  disk {
    image = "ubuntu-1404-trusty-v20160627"
  }

  # Local SSD disk
  disk {
    type    = "local-ssd"
    scratch = true
  }

  network_interface {
    network = "myNetwork"
    access_config {}
  }
}
```
What do I have to add to this so that Terraform also adds my SSH key, /Users/myUsername/.ssh/id_rsa.pub, to the instances?
If you just need to connect once, open the drop-down next to SSH on the VM instances page of the Cloud Console and pick how you want to SSH into the GCP VM instance. Selecting `Open in browser window` opens a window showing a connection being set up: your public key is transferred to the remote instance and an SSH-over-HTTP session is established.
I think something like this, added inside the `google_compute_instance` resource, should work:

```hcl
metadata = {
  ssh-keys = "${var.gce_ssh_user}:${file(var.gce_ssh_pub_key_file)}"
}
```
The metadata mechanism is described at https://cloud.google.com/compute/docs/instances/adding-removing-ssh-keys, and I found this example at https://github.com/hashicorp/terraform/issues/6678.
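Putting it together, here is a minimal sketch of how the variables and the resource could look. The variable names `gce_ssh_user` and `gce_ssh_pub_key_file` and their defaults are placeholders taken from the question; adjust them to your setup:

```hcl
# Placeholder variables for the snippet above
variable "gce_ssh_user" {
  default = "myUsername"
}

variable "gce_ssh_pub_key_file" {
  default = "/Users/myUsername/.ssh/id_rsa.pub"
}

resource "google_compute_instance" "default" {
  count        = 3
  name         = "a-consul${count.index}"
  machine_type = "n1-standard-1"
  zone         = "us-central1-a"

  disk {
    image = "ubuntu-1404-trusty-v20160627"
  }

  network_interface {
    network = "myNetwork"
    access_config {}
  }

  # ssh-keys takes "USERNAME:PUBLIC_KEY" entries; GCP's guest
  # environment creates the user and installs the key on boot.
  metadata = {
    ssh-keys = "${var.gce_ssh_user}:${file(var.gce_ssh_pub_key_file)}"
  }
}
```

After `terraform apply`, you should be able to connect with `ssh myUsername@<instance-external-ip>` using the matching private key.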