 

How can I configure docker logging to conditionally send to CloudWatch?

I have the following service definition in a docker-compose.yml:

  web:
    image: my_web
    build:
      context: ./
      dockerfile: web.docker
    container_name: my_web
    networks:
      - front
    ports:
      - "80:8080"
    volumes:
      - wwwlogs:/var/logs/www
    env_file:
      - ${SERVICE_ENVIRONMENT}.env
    links:
      - revproxy
    logging:
      driver: awslogs
      options:
        awslogs-group: my-web-group
        awslogs-region: us-east-1
        awslogs-stream-prefix: my-web

This works fine in production and sends everything off to CloudWatch as expected. However, I'm not clear on how this is supposed to work when I want to use the same docker-compose file locally (do not send to AWS, just log to STDOUT/STDERR) and in staging (where I want to send to a different awslogs-group/-prefix).

Any thoughts? In general I'm not a fan of having separate docker files for each environment - duplicated configuration increases the likelihood that something will get missed or not maintained properly. But Docker seems to have limited ability to conditionally provision things.

DrTeeth asked Nov 07 '22


1 Answer

This is more a limitation of Docker: you can't specify multiple logging drivers for a container. Sending logs to multiple destinations from a single docker-compose file is more complicated since Docker doesn't support it directly, but it's doable.

For example, you can use the Fluentd logging driver and run Fluentd as a separate sidecar container. In its configuration you can then create routing rules based on the environment: dev routes to stdout and prod routes to CloudWatch, using something like the Fluentd CloudWatch Logs plugin (fluent-plugin-cloudwatch-logs).
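As a sketch, the environment-specific Fluentd configuration could look something like this. The group, stream, and region values are taken from the compose file above; the match tag and plugin options assume fluent-plugin-cloudwatch-logs is installed in the Fluentd image:

    # fluent.conf for production -- assumes fluent-plugin-cloudwatch-logs
    <source>
      @type forward
      port 24224
    </source>

    <match docker.**>
      @type cloudwatch_logs
      log_group_name my-web-group
      log_stream_name my-web
      region us-east-1
    </match>

Locally, you would swap the match block for a stdout sink, so container logs just appear in the Fluentd container's output:

    # fluent.conf for local development
    <match docker.**>
      @type stdout
    </match>

Which fluent.conf gets used is then a deploy-time decision (e.g. mount a different file per environment), keeping the web service definition itself identical everywhere.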

This is another example of how to configure Fluentd with docker-compose.
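A minimal sketch of the compose side, assuming Fluentd runs as a sidecar with the appropriate fluent.conf mounted in per environment (the image tag and file paths are illustrative):

    fluentd:
      image: fluent/fluentd:v1.16-1
      volumes:
        - ./fluent.conf:/fluentd/etc/fluent.conf
      ports:
        - "24224:24224"

    web:
      image: my_web
      depends_on:
        - fluentd
      logging:
        driver: fluentd
        options:
          fluentd-address: localhost:24224
          tag: docker.web

The web service now uses the same fluentd logging driver in every environment; only the mounted Fluentd configuration decides whether logs end up on stdout or in CloudWatch.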

Rico answered Nov 14 '22