Creating google_logging_project_sink in Terraform doesn't push events to Pub/Sub

I want to create a Log Sink that listens for a specific message in Stackdriver and pushes the event to a Cloud Pub/Sub topic, which will trigger a Cloud Function.

Here is a part of my Terraform template.

resource "google_pubsub_topic" "dataflow_events" {
  name = join("-", concat(["dataflow-events", var.environment, terraform.workspace]))
}

resource "google_logging_project_sink" "dataflow_job_completion_sink" {
  name = join("-", concat(["dataflow-job-completion-sink", var.environment, terraform.workspace]))
  destination = "pubsub.googleapis.com/projects/${var.project}/topics/${google_pubsub_topic.dataflow_events.name}"
  filter = "resource.type=dataflow_step AND textPayload=\"Worker pool stopped.\""
}

Terraform version = 0.13.3

This deploys without any errors; however, no events are pushed to the Pub/Sub topic.

Yet when I create the sink manually (from the Cloud Web Console), it does push messages to the (same) Pub/Sub topic.

Here are two screenshots of the two sinks:

[screenshot: Terraform-created sink] [screenshot: manually created sink]

Note: Changing the unique_writer_identity parameter (to either true or false) on both sinks doesn't change the behavior. We set unique_writer_identity to true when we created the manual sink, which is why it has a global service account. But setting it to true in Terraform still doesn't push messages to Pub/Sub.

Your expertise is highly appreciated.

Asked Nov 13 '20 by Praneeth Peiris



1 Answer

Let me answer my own question here. Thanks to @milindu-sanoj-kumarage for the tip.

When we create the Log Sink, it is given a writer service account that is bound to it.

If you have set unique_writer_identity to true, the identity will look like [GENERATED_ID_1]@[GENERATED_ID_2].iam.gserviceaccount.com; if it's set to false, it will be serviceAccount:cloud-logs@system.gserviceaccount.com.
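
To check which identity your sink actually received, you can expose the sink's writer_identity attribute as an output (a minimal sketch, reusing the resource name from the question):

# Prints the service account the sink writes as, after `terraform apply`.
output "sink_writer_identity" {
  value = google_logging_project_sink.dataflow_job_completion_sink.writer_identity
}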

These service accounts are global (they reside outside your project) and won't appear in your project's IAM list until you add them.

You need to add this service account to your GCP project and give it permission to write to the destination:

  • Go to IAM in the IAM & Admin section and click on the Add button.
  • In the New members textbox, type cloud-logs@system.gserviceaccount.com (or the sink's unique writer identity).
  • Select the roles you need to add; in this case it was Pub/Sub > Pub/Sub Publisher.
  • Optionally, add a condition to restrict the grant to the given Pub/Sub topic.
  • Click on Save.

UPDATE: If your organization restricts adding cloud-logs@system.gserviceaccount.com, set unique_writer_identity = true in your Terraform and grant the role to the unique identity it generates instead, as in the Terraform sketch below. You can copy this identity from the Log Router (Cloud Sink) page in the Logging section.
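
If you manage the grant in Terraform as well, the console steps above translate into an IAM binding on the topic. This is a minimal sketch, assuming the resource names from the question and unique_writer_identity = true:

resource "google_logging_project_sink" "dataflow_job_completion_sink" {
  name        = join("-", ["dataflow-job-completion-sink", var.environment, terraform.workspace])
  destination = "pubsub.googleapis.com/projects/${var.project}/topics/${google_pubsub_topic.dataflow_events.name}"
  filter      = "resource.type=dataflow_step AND textPayload=\"Worker pool stopped.\""

  # Give this sink its own service account instead of the shared
  # cloud-logs@system.gserviceaccount.com identity.
  unique_writer_identity = true
}

# Allow the sink's generated service account to publish to the topic.
# writer_identity already includes the "serviceAccount:" prefix.
resource "google_pubsub_topic_iam_member" "sink_publisher" {
  topic  = google_pubsub_topic.dataflow_events.name
  role   = "roles/pubsub.publisher"
  member = google_logging_project_sink.dataflow_job_completion_sink.writer_identity
}

Because the binding references writer_identity, Terraform grants the role to whatever identity gets generated, so there is no need to copy it from the Log Router page by hand.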

Now the sink has permission to push events to Pub/Sub, so whenever a log entry matches the filter, it will be pushed to the given Pub/Sub topic.

Answered Oct 06 '22 by Praneeth Peiris