I have a custom generated log file that follows this pattern:
[2014-03-02 17:34:20] - 127.0.0.1|ERROR| E:\xampp\htdocs\test.php|123|subject|The error message goes here ; array ( 'create' => array ( 'key1' => 'value1', 'key2' => 'value2', 'key3' => 'value3' ), ) [2014-03-02 17:34:20] - 127.0.0.1|DEBUG| flush_multi_line
The second entry, [2014-03-02 17:34:20] - 127.0.0.1|DEBUG| flush_multi_line, is a dummy line whose only purpose is to let Logstash know that the multiline event is over; this line is dropped later on.
My config file is the following:
input {
  stdin {}
}
filter {
  multiline {
    pattern => "^\["
    what => "previous"
    negate => true
  }
  grok {
    match => ['message', "\[.+\] - %{IP:ip}\|%{LOGLEVEL:loglevel}"]
  }
  if [loglevel] == "DEBUG" {        # the event flush line
    drop {}
  } else if [loglevel] == "ERROR" { # the first line of the multiline event
    grok {
      match => ['message', ".+\|.+\| %{PATH:file}\|%{NUMBER:line}\|%{WORD:tag}\|%{GREEDYDATA:content}"]
    }
  } else {                          # it's a new line (from the multiline event)
    mutate {
      replace => ["content", "%{content} %{message}"] # supposing each new line will override the message field
    }
  }
}
output {
  stdout { debug => true }
}
The output for the content field is only the first line: The error message goes here ; array (
My problem is that I want to store the rest of the multiline event in the content field:
The error message goes here ; array ( 'create' => array ( 'key1' => 'value1', 'key2' => 'value2', 'key3' => 'value3' ), )
so that I can remove the message field later.
The @message field contains the whole multiline event, so I tried the mutate filter with replace on that field, but I just can't get it working :(
I don't understand how the multiline filter works; if someone could shed some light on this, it would be really appreciated.
Thanks,
Abdou.
Logstash has the ability to parse a log file and merge multiple log lines into a single event. You can do this using either the multiline codec or the multiline filter, depending on the desired effect. A codec is attached to a single input, whereas a filter can process events coming from multiple inputs.
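For instance, the same line-joining logic from the question can be attached directly to an input as a codec; a minimal sketch (the file path here is hypothetical):

```
input {
  file {
    path => "/var/log/app/custom.log"  # hypothetical path to the custom log
    codec => multiline {
      pattern => "^\["    # a line starting with [ begins a new event
      negate => true
      what => "previous"  # anything else is appended to the previous event
    }
  }
}
```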
Logstash receives the logs through input plugins and then uses filter plugins to parse and transform the data. The parsing and transformation are tailored to the systems at the output destination, so Logstash can forward only the required fields.
Although the Logstash file input plugin is a great way to get started developing configurations, Filebeat is the recommended product for collecting logs and shipping them off host servers. Filebeat can output logs to Logstash, and Logstash can receive and process these logs with the Beats input.
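A minimal sketch of the receiving side, assuming Filebeat is configured to ship to the conventional Beats port 5044:

```
input {
  beats {
    port => 5044  # Filebeat's output.logstash should point at this host:port
  }
}
```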
Logstash filters manipulate and create events (an Apache access event, for example), and many filter plugins are available to manage events. As an example of the Logstash aggregate filter, consider measuring the duration of every SQL query in a database task and computing the total time.
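A sketch of that aggregation, adapted from the aggregate filter's documented SQL-duration example (the taskid/logger field names come from that example's log format, not from the question's logs):

```
filter {
  grok {
    match => ["message", "%{LOGLEVEL:loglevel} - %{NOTSPACE:taskid} - %{NOTSPACE:logger} - %{WORD:label}( - %{INT:duration:int})?"]
  }
  if [logger] == "TASK_START" {
    aggregate {
      task_id => "%{taskid}"
      code => "map['sql_duration'] = 0"            # start a counter for this task
      map_action => "create"
    }
  }
  if [logger] == "SQL" {
    aggregate {
      task_id => "%{taskid}"
      code => "map['sql_duration'] += event.get('duration')"  # accumulate each query's duration
      map_action => "update"
    }
  }
  if [logger] == "TASK_END" {
    aggregate {
      task_id => "%{taskid}"
      code => "event.set('sql_duration', map['sql_duration'])" # attach the total to the final event
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}
```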
I went through the source code and worked out how the multiline filter behaves (see the comments below). Here is the working code:
input {
  stdin {}
}
filter {
  if "|ERROR|" in [message] {  # if this is the first line of a many-lines message
    grok {
      match => ['message', "\[.+\] - %{IP:ip}\|%{LOGLEVEL:loglevel}\| %{PATH:file}\|%{NUMBER:line}\|%{WORD:tag}\|%{GREEDYDATA:content}"]
    }
    mutate {
      replace => ["message", "%{content}"]  # replace the message field with the content field (so later lines auto-append to it)
      remove_field => ["content"]           # we no longer need this field
    }
  }
  multiline {  # nothing will pass this filter unless it is a new event (a new [2014-03-02 1... line)
    pattern => "^\["
    what => "previous"
    negate => true
  }
  if "|DEBUG| flush_multi_line" in [message] {
    drop {}  # we don't need the dummy line, so drop it
  }
}
output {
  stdout { debug => true }
}
Cheers,
Abdou