I'm trying to create an AWS ECS task with Terraform that puts its logs into a specific CloudWatch log group. The problem is that the container definition lives in a JSON file, and there is no way for me to map the CloudWatch group name from a .tf file into that .json file.
container_definition.json:
[
  {
    "name": "supreme-task",
    "image": "xxxx50690yyyy.dkr.ecr.eu-central-1.amazonaws.com/supreme-task",
    "essential": true,
    "portMappings": [
      {
        "containerPort": 5000,
        "hostPort": 5000
      }
    ],
    "logConfiguration": {
      "logDriver": "awslogs",
      "options": {
        "awslogs-group": "supreme-task-group", <- This needs to be taken from the variable.tf file.
        "awslogs-region": "eu-central-1",
        "awslogs-stream-prefix": "streaming"
      }
    }
  }
]
variable.tf:
variable "ecs_task_definition_name" {
description = "Task definition name."
type = string
default = "supreme-task-def"
}
variable "task_role" {
description = "Name of the task role."
type = string
default = "supreme-task-role"
}
variable "task_execution_role" {
description = "Name of the task execution role."
type = string
default = "supreme-task-exec-role"
}
variable "cloudwatch_group" {
description = "CloudWatch group name."
type = string
default = "supreme-task-group"
}
task definition:
resource "aws_ecs_task_definition" "task_definition" {
family = var.ecs_task_definition_name
requires_compatibilities = ["FARGATE"]
network_mode = "awsvpc"
cpu = 1024
memory = 4096
container_definitions = file("modules/ecs-supreme-task/task-definition.json")
execution_role_arn = aws_iam_role.task_execution_role.name
task_role_arn = aws_iam_role.task_role.name
}
Is there a way to do that? Or maybe this should be done differently?
Solved by following @ydaetskcorR's comment: I made the container definition an inline parameter.
container_definitions = <<DEFINITION
[
  {
    "name": "${var.repository_name}",
    "image": "${var.repository_uri}",
    "essential": true,
    "portMappings": [
      {
        "containerPort": 5000,
        "hostPort": 5000
      }
    ],
    "logConfiguration": {
      "logDriver": "awslogs",
      "options": {
        "awslogs-group": "${var.cloudwatch_group}",
        "awslogs-region": "eu-central-1",
        "awslogs-stream-prefix": "ecs"
      }
    }
  }
]
DEFINITION
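As an aside, on Terraform 0.12+ you can skip hand-written JSON (and its quoting pitfalls) entirely and build the definition with the built-in jsonencode() function. A minimal sketch using the same variables as above:

container_definitions = jsonencode([
  {
    # Same container definition as the heredoc above, expressed as HCL.
    name      = var.repository_name
    image     = var.repository_uri
    essential = true
    portMappings = [
      {
        containerPort = 5000
        hostPort      = 5000
      }
    ]
    logConfiguration = {
      logDriver = "awslogs"
      options = {
        "awslogs-group"         = var.cloudwatch_group
        "awslogs-region"        = "eu-central-1"
        "awslogs-stream-prefix" = "ecs"
      }
    }
  }
])

This way Terraform guarantees the rendered JSON is syntactically valid, instead of ECS rejecting a malformed definition at registration time.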
If you want to load the container definition as a template, to avoid inlining the content in the .tf files, you could:
1- Create the container definition as a template file with variables; just note that the extension should be .tpl:
container_definition.tpl
[
  {
    "name": "supreme-task",
    "image": "xxxx50690yyyy.dkr.ecr.eu-central-1.amazonaws.com/supreme-task",
    "essential": true,
    "portMappings": [
      {
        "containerPort": 5000,
        "hostPort": 5000
      }
    ],
    "logConfiguration": {
      "logDriver": "awslogs",
      "options": {
        "awslogs-group": "${cloudwatch_group}",
        "awslogs-region": "eu-central-1",
        "awslogs-stream-prefix": "streaming"
      }
    }
  }
]
2- Then load the file as a template and inject the variables:
task_definition.tf
data "template_file" "task_definition" {
  template = file("${path.module}/container_definition.tpl")

  vars = {
    cloudwatch_group = var.cloudwatch_group
  }
}
resource "aws_ecs_task_definition" "task_definition" {
family = var.ecs_task_definition_name
requires_compatibilities = ["FARGATE"]
network_mode = "awsvpc"
cpu = 1024
memory = 4096
container_definitions = data.template_file.task_definition.rendered
execution_role_arn = aws_iam_role.task_execution_role.name
task_role_arn = aws_iam_role.task_role.name
}
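Note that the template_file data source comes from the separate hashicorp/template provider, which is deprecated; on Terraform 0.12+ the built-in templatefile() function gives the same result without the extra provider or data source. A minimal sketch, assuming the same .tpl file and variable:

container_definitions = templatefile("${path.module}/container_definition.tpl", {
  cloudwatch_group = var.cloudwatch_group
})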
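One more caveat: the awslogs log driver does not create the log group by default, so whichever approach you choose, make sure the group named by var.cloudwatch_group actually exists before the task starts. A minimal sketch managing it in Terraform (the resource label and retention value here are just illustrative):

resource "aws_cloudwatch_log_group" "task_group" {
  # Must match the "awslogs-group" option in the container definition.
  name              = var.cloudwatch_group
  retention_in_days = 14 # example retention; adjust as needed
}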