 

Log level as a field for Docker GELF logging driver

I want to get stdout logs from a docker container and send them to ELK stack. So far, I know that there is a GELF logging driver in Docker.

However, I can't figure out how to parse the ERROR, WARNING, or DEBUG level out of the message and put it into a new field, like log_level, before Docker sends the log to ELK.

The log message should be something like:

{
  "client": "127.0.0.1",
  "user": "frank",
  "timestamp": "2000-10-10 13:55:36 -0700",
  "method": "GET",
  "uri": "/apache_pb.gif",
  "protocol": "HTTP/1.0",
  "status": 200,
  "size": 2326,
  "message": "[ERROR] Error connecting to MongoDB",
  "_logLevel" : "ERROR"
}

where Docker has added "_logLevel": "ERROR" before sending it to ELK.

Thanks.

asked Feb 04 '23 by skynyrd

1 Answer

I think you are confusing what Docker does for you with what logstash (or potentially logspout) is there for. The Docker GELF driver adds the following fields: Hostname, Container ID, Container Name, Image ID, Image Name, created (container creation time), and level (6 for stdout, 3 for stderr; not to be confused with your application's log level). These are the things Docker knows about. Docker has no idea about your user or client; those fields are not created by the GELF driver or by Docker.
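To make that concrete, a GELF message emitted by the Docker driver looks roughly like the sketch below (field names follow the GELF convention of prefixing custom fields with `_`; all values here are made up for illustration):

```json
{
  "version": "1.1",
  "host": "myhost",
  "short_message": "[ERROR] Error connecting to MongoDB",
  "timestamp": 1675512900.0,
  "level": 6,
  "_container_id": "1234abcd...",
  "_container_name": "my_app",
  "_image_id": "sha256:...",
  "_image_name": "my_app:latest",
  "_command": "/bin/sh -c ...",
  "_created": "2023-02-04T12:00:00Z"
}
```

Note that `level` is 6 here because the line came from stdout, even though the message text says ERROR; that is exactly the gap a grok filter has to close.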


To achieve what you want you would have to use a grok filter in logstash:

My messages have the log format:

${date:format=yyyy-MM-dd HH:mm:ss.fff} | ${correlationId} | ${level} | ${callSite} | ${message}

And I run logstash from docker compose like this:

  logstash:
    image: docker.elastic.co/logstash/logstash:5.3.1
    logging:
      driver: "json-file"
    networks:
      - logging
    ports:
      - "12201:12201"
      - "12201:12201/udp"
    entrypoint: logstash -e 'input { gelf { } }
                        filter{
                                grok { 
                                    match=> ["message", "%{SPACE}%{DATESTAMP:timestamp}%{SPACE}\|%{SPACE}%{DATA:correlation_Id}%{SPACE}\|%{SPACE}%{DATA:log_level}%{SPACE}\|%{SPACE}%{DATA:call_site}%{SPACE}\|%{SPACE}%{DATA:message}%{SPACE}$$"]
                                    overwrite => [ "message" ]
                                }
                                date {
                                    locale => "en"
                                    match => ["timestamp", "dd-MM-YYYY HH:mm:ss:SSS"]
                                    target => "@timestamp"
                                    remove_field => [ "timestamp" ]
                                }
                        }
                        output { stdout{ } elasticsearch { hosts => ["http://elasticsearch:9200"] } }'
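If you want to sanity-check the field split before wiring everything through logstash, here is a rough Python equivalent of the grok pattern above (a sketch only; grok's DATESTAMP pattern is far more permissive than this simple regex, and the `$$` in the compose file is just docker-compose's escape for a literal `$`):

```python
import re

# Pipe-delimited layout: timestamp | correlationId | level | callSite | message
LINE_RE = re.compile(
    r"\s*(?P<timestamp>\d{2}-\d{2}-\d{4} \d{2}:\d{2}:\d{2}:\d{3})\s*\|"
    r"\s*(?P<correlation_id>[^|]*?)\s*\|"
    r"\s*(?P<log_level>[^|]*?)\s*\|"
    r"\s*(?P<call_site>[^|]*?)\s*\|"
    r"\s*(?P<message>.*?)\s*$"
)

sample = ("01-02-2023 13:55:36:123 | 51c489da-2ba7-466e-abe1-14c236de54c5 "
          "| INFO | HostingLoggerExtensions.RequestFinished | Request finished")

m = LINE_RE.match(sample)
print(m.group("log_level"))  # INFO
```

Once the regex pulls out `log_level` correctly for your sample lines, the corresponding grok pattern should populate the same field in logstash.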

And here is how I run a container that delivers logs in the specified format (everything identical except the date):

docker run --log-driver=gelf --log-opt gelf-address=udp://0.0.0.0:12201 ubuntu /bin/sh -c 'while true; do date "+%d-%m-%Y %H:%M:%S:%3N" | xargs printf "%s %s | 51c489da-2ba7-466e-abe1-14c236de54c5 | INFO | HostingLoggerExtensions.RequestFinished    | Request finished in 35.1624ms 200 application/json; charset=utf-8 message end\n"; sleep 1 ; done'

I hope this helps you get started. Make sure you start the containers that produce logs after logstash is up.

Maybe read the grok documentation for more info.

answered Feb 12 '23 by herm