
Serilog logs collected by Fluent Bit and shipped to Elasticsearch in Kubernetes don't get JSON-parsed correctly

I'm using the EFK stack (Elasticsearch, Fluent Bit, Kibana) on Kubernetes (Minikube). I have an ASP.NET Core app that uses Serilog to write JSON to the console. The logs do ship to Elasticsearch, but they arrive as unparsed strings in the "log" field; that is the problem.

This is the console output:

{
    "@timestamp": "2019-03-22T22:08:24.6499272+01:00",
    "level": "Fatal",
    "messageTemplate": "Text: {Message}",
    "message": "Text: \"aaaa\"",
    "exception": {
        "Depth": 0,
        "ClassName": "",
        "Message": "Boom!",
        "Source": null,
        "StackTraceString": null,
        "RemoteStackTraceString": "",
        "RemoteStackIndex": -1,
        "HResult": -2146232832,
        "HelpURL": null
    },
    "fields": {
        "Message": "aaaa",
        "SourceContext": "frontend.values.web.Controllers.HomeController",
        "ActionId": "0a0967e8-be30-4658-8663-2a1fd7d9eb53",
        "ActionName": "frontend.values.web.Controllers.HomeController.WriteTrace (frontend.values.web)",
        "RequestId": "0HLLF1A02IS16:00000005",
        "RequestPath": "/Home/WriteTrace",
        "CorrelationId": null,
        "ConnectionId": "0HLLF1A02IS16",
        "ExceptionDetail": {
            "HResult": -2146232832,
            "Message": "Boom!",
            "Source": null,
            "Type": "System.ApplicationException"
        }
    }
}
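
In Elasticsearch this whole document shows up escaped inside the log field. For context, the container runtime wraps every console line before Fluent Bit reads it, so a line in /var/log/containers/*.log looks roughly like this (an abbreviated, illustrative sketch of the Docker json-file format, not my actual data):

{
    "log": "{\"@timestamp\":\"2019-03-22T22:08:24.6499272+01:00\",\"level\":\"Fatal\",\"message\":\"Text: \\\"aaaa\\\"\", ... }\n",
    "stream": "stdout",
    "time": "2019-03-22T21:08:24.650Z"
}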

This is the Serilog configuration part of Program.cs (ExceptionAsObjectJsonFormatter inherits from ElasticsearchJsonFormatter):

.UseSerilog((ctx, config) =>
{
    var shouldFormatElastic = ctx.Configuration.GetValue<bool>("LOG_ELASTICFORMAT", false);
    config
        .ReadFrom.Configuration(ctx.Configuration) // Read from appsettings and env, cmdline
        .Enrich.FromLogContext()
        .Enrich.WithExceptionDetails();

    var logFormatter = new ExceptionAsObjectJsonFormatter(renderMessage: true);
    var logMessageTemplate = "[{Timestamp:HH:mm:ss} {Level:u3}] {Message:lj}{NewLine}{Exception}";

    if (shouldFormatElastic)
        config.WriteTo.Console(logFormatter, standardErrorFromLevel: LogEventLevel.Error);
    else
        config.WriteTo.Console(standardErrorFromLevel: LogEventLevel.Error, outputTemplate: logMessageTemplate);

})
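
For completeness, ReadFrom.Configuration(ctx.Configuration) also reads a Serilog section from appsettings.json / environment variables; that part isn't shown here, but a minimal sketch of the shape Serilog.Settings.Configuration expects, just for reference:

{
    "Serilog": {
        "MinimumLevel": {
            "Default": "Information",
            "Override": {
                "Microsoft": "Warning"
            }
        }
    }
}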

Using these NuGet packages:

  • Serilog.AspNetCore
  • Serilog.Exceptions
  • Serilog.Formatting.Elasticsearch
  • Serilog.Settings.Configuration
  • Serilog.Sinks.Console

This is how it looks in Kibana (screenshot not included).

And this is the ConfigMap for Fluent Bit:

fluent-bit-filter.conf:
[FILTER]
    Name                kubernetes
    Match               kube.*
    Kube_URL            https://kubernetes.default.svc:443
    Kube_CA_File        /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
    Kube_Token_File     /var/run/secrets/kubernetes.io/serviceaccount/token
    Merge_Log           On
    K8S-Logging.Parser  On
    K8S-Logging.Exclude On

fluent-bit-input.conf:
[INPUT]
    Name             tail
    Path             /var/log/containers/*.log
    Parser           docker
    Tag              kube.*
    Refresh_Interval 5
    Mem_Buf_Limit    5MB
    Skip_Long_Lines  On

fluent-bit-output.conf:
[OUTPUT]
    Name  es
    Match *
    Host  elasticsearch
    Port  9200
    Logstash_Format On
    Retry_Limit False
    Type  flb_type
    Time_Key @timestamp
    Replace_Dots On
    Logstash_Prefix kubernetes_cluster

fluent-bit-service.conf:
[SERVICE]
    Flush        1
    Daemon       Off
    Log_Level    info
    Parsers_File parsers.conf

fluent-bit.conf:
@INCLUDE fluent-bit-service.conf
@INCLUDE fluent-bit-input.conf
@INCLUDE fluent-bit-filter.conf
@INCLUDE fluent-bit-output.conf

parsers.conf:

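With Merge_Log On in the kubernetes filter above, Fluent Bit is supposed to detect that the log field contains JSON and lift those keys into the record. What I would expect to land in Elasticsearch is roughly this (an illustrative, trimmed sketch with made-up kubernetes metadata):

{
    "@timestamp": "2019-03-22T22:08:24.649Z",
    "level": "Fatal",
    "message": "Text: \"aaaa\"",
    "fields": { "SourceContext": "frontend.values.web.Controllers.HomeController" },
    "kubernetes": {
        "namespace_name": "default",
        "pod_name": "frontend-xxxxx",
        "container_name": "frontend"
    }
}

Instead, everything stays inside log as one string.
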
But I also tried https://raw.githubusercontent.com/fluent/fluent-bit-kubernetes-logging/master/output/elasticsearch/fluent-bit-configmap.yaml with my modifications.

I installed Fluent Bit with Helm: helm install stable/fluent-bit --name=fluent-bit --namespace=logging --set backend.type=es --set backend.es.host=elasticsearch --set on_minikube=true

I also get a lot of the following errors:

log:{"took":0,"errors":true,"items":[{"index":{"_index":"kubernetes_cluster-2019.03.22","_type":"flb_type","_id":"YWCOp2kB4wEngjaDvxNB","status":400,"error":{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"json_parse_exception","reason":"Duplicate field '@timestamp' at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@432f75a7; line: 1, column: 1248]"}}}}]}

and

log:[2019/03/22 22:38:57] [error] [out_es] could not pack/validate JSON response stream:stderr

as I can see in Kibana.
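
My reading of the duplicate-field error (not fully verified): Serilog already emits an @timestamp property, and with Logstash_Format On the es output adds its own Time_Key @timestamp as well, so once the log gets merged the bulk payload contains the key twice, roughly:

{
    "@timestamp": "2019-03-22T21:08:24.650Z",           (added by the es output's Time_Key)
    "@timestamp": "2019-03-22T22:08:24.6499272+01:00",  (already present in the Serilog document)
    ...
}

which Elasticsearch rejects with the json_parse_exception above.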

1 Answer

The problem was a bad Fluent Bit ConfigMap. This one works:

apiVersion: v1
kind: ConfigMap
metadata:
  name: fluent-bit-config
  namespace: logging
  labels:
    k8s-app: fluent-bit
data:
  # Configuration files: server, input, filters and output
  # ======================================================
  fluent-bit.conf: |
    [SERVICE]
        Flush         1
        Log_Level     info
        Daemon        off
        Parsers_File  parsers.conf
        HTTP_Server   On
        HTTP_Listen   0.0.0.0
        HTTP_Port     2020        
    @INCLUDE input-kubernetes.conf
    @INCLUDE filter-kubernetes.conf
    @INCLUDE output-elasticsearch.conf
  input-kubernetes.conf: |
    [INPUT]
        Name              tail
        Tag               kube.*
        Path              /var/log/containers/*.log
        Parser            docker
        DB                /var/log/flb_kube.db
        Mem_Buf_Limit     5MB
        Skip_Long_Lines   On
        Refresh_Interval  10
  filter-kubernetes.conf: |
    [FILTER]
        Name                kubernetes
        Match               kube.*
        Kube_URL            https://kubernetes.default.svc:443
        # These two settings may fix some duplicate-field exceptions
        Merge_Log           On
        Merge_JSON_Key      k8s
        K8S-Logging.Parser  On
        K8S-Logging.exclude True
  output-elasticsearch.conf: |
    [OUTPUT]
        Name            es
        Match           *
        Host            ${FLUENT_ELASTICSEARCH_HOST}
        Port            ${FLUENT_ELASTICSEARCH_PORT}
        Logstash_Format On
        # This fixes errors where kubernetes.apps.name must be an object
        Replace_Dots    On 
        Retry_Limit     False
        Type            flb_type
        # Renaming the time key may fix the duplicate '@timestamp' field exception
        Time_Key        @timestamp_es
        # The Index Prefix:
        Logstash_Prefix logstash_07
  parsers.conf: |
    [PARSER]
        Name   apache
        Format regex
        Regex  ^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   apache2
        Format regex
        Regex  ^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^ ]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   apache_error
        Format regex
        Regex  ^\[[^ ]* (?<time>[^\]]*)\] \[(?<level>[^\]]*)\](?: \[pid (?<pid>[^\]]*)\])?( \[client (?<client>[^\]]*)\])? (?<message>.*)$
    [PARSER]
        Name   nginx
        Format regex
        Regex ^(?<remote>[^ ]*) (?<host>[^ ]*) (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   json
        Format json
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name        docker
        Format      json
        #Time_Key    time
        Time_Key    @timestamp
        Time_Format %Y-%m-%dT%H:%M:%S.%L
        Time_Keep   Off
        # See: https://fluentbit.io/documentation/0.14/parser/decoder.html
        # Command      |  Decoder | Field | Optional Action
        # =============|==================|=================
        # Decode_Field_As   escaped    log
        # Decode_Field_As   escaped    log    do_next
        # Decode_Field_As   json       log     
    [PARSER]
        Name        syslog
        Format      regex
        Regex       ^\<(?<pri>[0-9]+)\>(?<time>[^ ]* {1,2}[^ ]* [^ ]*) (?<host>[^ ]*) (?<ident>[a-zA-Z0-9_\/\.\-]*)(?:\[(?<pid>[0-9]+)\])?(?:[^\:]*\:)? *(?<message>.*)$
        Time_Key    time
        Time_Format %b %d %H:%M:%S
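
To pick up the changes I re-apply the ConfigMap and recreate the Fluent Bit pods so they reload it. The file name and the label selector below are placeholders; use whatever labels your Fluent Bit DaemonSet pods actually carry:

kubectl apply -f fluent-bit-configmap.yaml
kubectl delete pod -l k8s-app=fluent-bit -n logging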