 

Docker apps logging with Filebeat and Logstash

I have a set of dockerized applications scattered across multiple servers, and I'm trying to set up production-level centralized logging with ELK. I'm OK with the ELK part itself, but I'm a little confused about how to forward the logs to my Logstash instances. I'm trying to use Filebeat because of its load-balancing feature. I'd also like to avoid packing Filebeat (or anything else) into all my containers, and to keep it separate, dockerized or not.

How can I proceed?

I've been trying the following. My containers log to stdout, so with a non-dockerized Filebeat configured to read from stdin I do:

docker logs -f mycontainer | ./filebeat -e -c filebeat.yml
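
For context, a minimal Filebeat 1.x config for reading stdin and load balancing across Logstash hosts looks something like this (the host names are placeholders, not from the original question):

    filebeat:
      prospectors:
        # read events from stdin instead of tailing files
        - input_type: stdin
    output:
      logstash:
        # distribute events across multiple Logstash instances
        hosts: ["logstash1:5044", "logstash2:5044"]
        loadbalance: true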

That appears to work at the beginning. The first logs are forwarded to my Logstash (the cached ones, I guess). But at some point it gets stuck and keeps sending the same event.

Is that just a bug, or am I headed in the wrong direction? What solution have you set up?

asked Oct 30 '15 by Gianluca

People also ask

How do you pull logs from a Docker container into ELK, and what does the process look like?

In the context of shipping logs into ELK, using the syslog logging driver is probably the easiest way to go. You will need to enable it per container, and the result is a pipeline of Docker container logs being output into your syslog instance.

What is the difference between Logstash and Filebeat?

The important difference between Logstash and Filebeat lies in their functionality, and in the fact that Filebeat consumes fewer resources. In general, Logstash can consume and transform a wide variety of inputs, while the specialized Beats do the work of gathering the data with minimal RAM and CPU.

Is Logstash necessary with Filebeat?

You can now use Filebeat to send logs directly to Elasticsearch, or to Logstash (no Logstash agent is needed on each host, but you still need a Logstash server, of course).
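
For example, pointing Filebeat's output section straight at Elasticsearch instead of Logstash is just a config change (a sketch; ES_HOST:9200 is a placeholder):

    output:
      elasticsearch:
        # ship events directly to Elasticsearch, bypassing Logstash
        hosts: ["ES_HOST:9200"]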


2 Answers

Here's one way to forward docker logs to the ELK stack (requires docker >= 1.8 for the gelf log driver):

  1. Start a Logstash container with the gelf input plugin, so that it reads gelf messages and outputs them to an Elasticsearch host (ES_HOST:PORT):

    docker run --rm -p 12201:12201/udp logstash \
        logstash -e 'input { gelf { } } output { elasticsearch { hosts => ["ES_HOST:PORT"] } }'
    
  2. Now start a Docker container and use the gelf Docker logging driver. Here's a dumb example:

    docker run --log-driver=gelf --log-opt gelf-address=udp://localhost:12201 busybox \
        /bin/sh -c 'while true; do echo "Hello $(date)"; sleep 1; done'
    
  3. Load up Kibana and things that would've landed in docker logs are now visible. The gelf source code shows that some handy fields are generated for you (hat-tip: Christophe Labouisse): _container_id, _container_name, _image_id, _image_name, _command, _tag, _created.
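
To illustrate, an event indexed via gelf ends up looking something like this in Elasticsearch (a hypothetical document, trimmed to a few of the fields above):

    {
      "message": "Hello Fri Oct 30 09:10:00 UTC 2015",
      "_container_name": "my_busybox",
      "_image_name": "busybox",
      "_command": "/bin/sh -c 'while true; do echo \"Hello $(date)\"; sleep 1; done'"
    }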

If you use docker-compose (make sure to use docker-compose >= 1.5), just add the appropriate settings to docker-compose.yml after starting the Logstash container:

    log_driver: "gelf"
    log_opt:
      gelf-address: "udp://localhost:12201"
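
Putting it together, a minimal docker-compose.yml (v1 format) for the busybox example above might look like this (the service name app is made up for illustration):

    # docker-compose.yml, v1 format (docker-compose >= 1.5)
    app:
      image: busybox
      command: /bin/sh -c 'while true; do echo "Hello $(date)"; sleep 1; done'
      # route this container's stdout/stderr to the gelf endpoint
      log_driver: "gelf"
      log_opt:
        gelf-address: "udp://localhost:12201"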

answered Oct 14 '22 by Pete


Docker allows you to specify the log driver to use. This answer does not cover Filebeat or load balancing.

In a presentation I used syslog to forward the logs to a Logstash (ELK) instance listening on port 5000. The following command constantly sends messages through syslog to Logstash:

docker run -t -d --log-driver=syslog --log-opt syslog-address=tcp://127.0.0.1:5000 ubuntu /bin/bash -c 'while true; do echo "Hello $(date)"; sleep 1; done'
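
On the receiving side, the Logstash instance needs a syslog input on that port. A minimal pipeline sketch (the Elasticsearch host is a placeholder):

    # listen for syslog messages on port 5000 (matches the address above)
    input { syslog { port => 5000 } }
    # index the parsed events into Elasticsearch
    output { elasticsearch { hosts => ["ES_HOST:9200"] } }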

answered Oct 14 '22 by michaelbahr