Filebeat multiline parsing of Java exception in docker container not working

I'm running Filebeat to ship logs from a Java service that runs in a container. The host runs many other containerized services, and the same Filebeat daemon gathers the logs of all containers running on that host. Filebeat forwards the logs to Logstash, which dumps them into Elasticsearch.

I'm trying to use Filebeat multiline capabilities to combine log lines from Java exceptions into one log entry using the following Filebeat configuration:

filebeat:
  prospectors:
    # container logs
    -
      paths:
        - "/log/containers/*/*.log"
      document_type: containerlog
      multiline:
        pattern: "^\t|^[[:space:]]+(at|...)|^Caused by:"
        match: after

output:
  logstash:
    hosts: ["{{getv "/logstash/host"}}:{{getv "/logstash/port"}}"]

Example of a Java stack trace that should be aggregated into one event:

This stack trace was copied from a Docker log entry (the output of docker logs java_service):

[2016-05-25 12:39:04,744][DEBUG][action.bulk              ] [Set] [***][3] failed to execute bulk item (index) index {[***][***][***], source[{***}}
MapperParsingException[Field name [events.created] cannot contain '.']
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:273)
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:218)
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parse(ObjectMapper.java:193)
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:305)
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:218)
    at org.elasticsearch.index.mapper.object.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:139)
    at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:118)
    at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:99)
    at org.elasticsearch.index.mapper.MapperService.parse(MapperService.java:498)
    at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.applyRequest(MetaDataMappingService.java:257)
    at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.execute(MetaDataMappingService.java:230)
    at org.elasticsearch.cluster.service.InternalClusterService.runTasksForExecutor(InternalClusterService.java:468)
    at org.elasticsearch.cluster.service.InternalClusterService$UpdateTask.run(InternalClusterService.java:772)
    at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:231)
    at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:194)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

However, with the Filebeat configuration shown above, each line of the stack trace still shows up as a separate event in Elasticsearch.

Any idea what I'm doing wrong? Also note that since I need to ship logs from several files with Filebeat, the multiline aggregation cannot be done on the Logstash side.

Versions

FILEBEAT_VERSION 1.1.0

asked May 25 '16 by gpestana

1 Answer

Stumbled over this problem today as well.

This is working for me (filebeat.yml):

filebeat.prospectors:
- type: log
  multiline.pattern: "^[[:space:]]+(at|\\.{3})\\b|^Caused by:"
  multiline.negate: false
  multiline.match: after
  paths:
    - '/var/lib/docker/containers/*/*.log'
  json.message_key: log
  json.keys_under_root: true
  processors:
  - add_docker_metadata: ~
output.elasticsearch:
  hosts: ["es-client.es-cluster:9200"]

I use Filebeat 6.2.2 to send the logs directly to Elasticsearch.
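
If you need to keep Logstash in between, as in the question, the prospector, multiline and json settings should stay the same and only the output block changes. Something along these lines (an untested sketch; host and port are placeholders):

output.logstash:
  hosts: ["logstash-host:5044"]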

answered by Stephan