I want to create a Log Sink that listens for a specific message in Stackdriver and pushes the event to a Cloud Pub/Sub topic, which will trigger a Cloud Function.
Here is a part of my Terraform template.
resource "google_pubsub_topic" "dataflow_events" {
name = join("-", concat(["dataflow-events", var.environment, terraform.workspace]))
}
resource "google_logging_project_sink" "dataflow_job_completion_sink" {
name = join("-", concat(["dataflow-job-completion-sink", var.environment, terraform.workspace]))
destination = "pubsub.googleapis.com/projects/${var.project}/topics/${google_pubsub_topic.dataflow_events.name}"
filter = "resource.type=dataflow_step AND textPayload=\"Worker pool stopped.\""
}
Terraform version: 0.13.3
This gets deployed without any errors, but no events are pushed to the Pub/Sub topic. However, when I create the sink manually (from the Cloud Web Console), it pushes messages to the (same) Pub/Sub topic.
Here are two screenshots of two sinks.
Note: Changing the `unique_writer_identity` parameter (either `true` or `false`) on both of them doesn't change the behavior. We set `unique_writer_identity` to `true` when we created the manual sink, and that's why it has a global service account. But setting this to `true` in Terraform doesn't push messages to Pub/Sub.
Your expertise is highly appreciated.
For each Cloud project, billing account, folder, and organization, Logging automatically creates two log buckets: `_Required` and `_Default`. Logging automatically creates sinks named `_Required` and `_Default` that, in the default configuration, route logs to the correspondingly named buckets.
For the `_Default` and user-defined log buckets, you can configure Cloud Logging to retain your logs between 1 day and 3650 days.
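As a side note, if you want to manage that retention period from the same Terraform configuration, a minimal sketch using the `google_logging_project_bucket_config` resource (assuming a recent enough `google` provider; the 30-day value is only an example) might look like:

```hcl
# Sketch: set a custom retention period on the _Default log bucket.
# The retention_days value below is illustrative, not a recommendation.
resource "google_logging_project_bucket_config" "default_bucket" {
  project        = var.project
  location       = "global"
  bucket_id      = "_Default"
  retention_days = 30
}
```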
Let me answer my own question here. Thanks, @milindu-sanoj-kumarage for the tip.
When we create the Log Sink, it gets a service account bound to it.
If you have set `unique_writer_identity` to `true`, it will look like `[GENERATED_ID_1]@[GENERATED_ID_2].iam.gserviceaccount.com`, and it will be `serviceAccount:[email protected]` if it's set to `false`.
These are global service accounts (they reside outside your projects) unless you have already added them to your project.
You need to add this service account (`[email protected]`) to your GCP project and give it permission to write to the destination.
UPDATE:
If your organization restricts adding `[email protected]`, you should set `unique_writer_identity = true` in your Terraform, and then grant the generated identity the required role in the IAM window. You can find this identity on the Log Router page in the Logging section.
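This grant can also be wired up entirely in Terraform, since the sink exports its generated identity as the `writer_identity` attribute; a sketch building on the resources above:

```hcl
resource "google_logging_project_sink" "dataflow_job_completion_sink" {
  name                   = join("-", concat(["dataflow-job-completion-sink", var.environment, terraform.workspace]))
  destination            = "pubsub.googleapis.com/projects/${var.project}/topics/${google_pubsub_topic.dataflow_events.name}"
  filter                 = "resource.type=dataflow_step AND textPayload=\"Worker pool stopped.\""
  unique_writer_identity = true
}

# The exported writer_identity already carries the "serviceAccount:" prefix,
# so it can be used directly as the IAM member.
resource "google_pubsub_topic_iam_member" "sink_publisher" {
  topic  = google_pubsub_topic.dataflow_events.name
  role   = "roles/pubsub.publisher"
  member = google_logging_project_sink.dataflow_job_completion_sink.writer_identity
}
```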
Now the sink has permission to push events to Pub/Sub. So, whenever a log entry matches the filter, the sink will push it to the given Pub/Sub topic.