Converting date format to YYYY-MM-DD from YYYY/MM/DD HH:MM:SS format in Logstash for nginx error logs

I have nginx error logs of the following form:

2015/09/30 22:19:38 [error] 32317#0: *23 [lua] responses.lua:61: handler(): Cassandra error: Error during UNIQUE check: Cassandra error: connection refused, client: 127.0.0.1, server: , request: "POST /consumers/ HTTP/1.1", host: "localhost:8001"

As mentioned here, I am able to parse these logs.

My filter configuration is as follows:

filter {  
  grok {
      match => {
        "message" => [
          "%{DATESTAMP:mydate} \[%{DATA:severity}\] (%{NUMBER:pid:int}#%{NUMBER}: \*%{NUMBER}|\*%{NUMBER}) %{GREEDYDATA:mymessage}",
          "%{DATESTAMP:mydate} \[%{DATA:severity}\] %{GREEDYDATA:mymessage}",
          "%{DATESTAMP:mydate} %{GREEDYDATA:mymessage}"
        ]
      }
      add_tag => ["nginx_error_pattern"]
    }

    if ("nginx_error_pattern" in [tags]) {      
      grok {
        match => {
          "mymessage" => [
            "server: %{DATA:[request_server]},"
          ]
        }        
      }

      grok {
        match => {
          "mymessage" => [
            "host: \"%{IPORHOST:[request_host]}:%{NUMBER:[port]}\""
          ]
        }        
      }

      grok {
        match => {
          "mymessage" => [
            "request: \"%{WORD:[request_method]} %{DATA:[request_uri]} HTTP/%{NUMBER:[request_version]:float}\""
          ]
        }        
      }

      grok {
        match => {
          "mymessage" => [
            "client: %{IPORHOST:[clientip]}",
            "client %{IP:[clientip]} "
          ]
        }        
      }

      grok {
        match => {
          "mymessage" => [
            "referrer: \"%{DATA:[request_referrer]}\""
          ]
        }       
      }                
    }
}

mydate contains a date of the form:

"mydate" => "15/09/30 22:19:38"

Can someone let me know how I can add one more field (say, log_day) containing the date in the form 2015-09-30?

asked by tuk

1 Answer

It is always a good idea to store the time/date in a field of type date, since that enables complex range queries in Elasticsearch or Kibana.

You can use Logstash's date filter to parse the date.

Filter:

date {
    match => [ "mydate", "yy/MM/dd HH:mm:ss" ]
}

(Note: in Joda-Time patterns, which the date filter uses, lowercase yy is the two-digit calendar year; uppercase YY is the week-based year and can misparse dates around the new year.)

Result:

"@timestamp" => "2015-09-30T20:19:38.000Z"

The date filter puts the result in the @timestamp field by default.
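Note that @timestamp is always stored in UTC, which is why the parsed time above is two hours behind the time in the log line: the filter assumed the host's local timezone (apparently UTC+2). If your logs are written in a different timezone than the machine running Logstash, set it explicitly. A minimal sketch (the timezone value here is an assumption; remove_field is a common option available on every filter and drops the raw string once parsing succeeds):

date {
    match        => [ "mydate", "yy/MM/dd HH:mm:ss" ]
    timezone     => "Europe/Berlin"   # assumption: the zone the logs were written in
    remove_field => [ "mydate" ]      # applied only when parsing succeeds
}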

To avoid overwriting @timestamp, specify a different target field, e.g. log_day:

Filter:

date {
    match  => [ "mydate", "yy/MM/dd HH:mm:ss" ]
    target => "log_day"
}

Result:

"log_day" => "2015-09-30T20:19:38.000Z"

Once you have a field of type date, you can proceed with further operations. For example, the third-party date_formatter filter (installed separately as the logstash-filter-date_formatter plugin) can create another date field in your desired format.

date_formatter {
    source  => "log_day"
    target  => "log_day"   # overwrite log_day with the formatted string
    pattern => "yyyy-MM-dd"
}

Result:

"log_day" => "2015-09-30"
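If you would rather not install a third-party plugin, the same formatted string can be built with the stock mutate filter. A sketch, assuming mydate always has the form "15/09/30 22:19:38" with a 20xx year:

# Build "2015/09/30 22:19:38", then strip the time and swap "/" for "-"
mutate {
    add_field => { "log_day" => "20%{mydate}" }
}
mutate {
    gsub => [
        "log_day", " .*$", "",   # drop everything from the first space on
        "log_day", "/", "-"      # 2015/09/30 -> 2015-09-30
    ]
}

Two separate mutate blocks are needed because add_field is a common option that runs after gsub within a single block.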

answered by hurb