 

Logstash: split events from an XML file into multiple documents, keeping information from the root tag

My problem: I have XML files containing events that I want to parse with Logstash so I can query them later in Kibana. I want each event to keep the information from the ROOT tag.

The input looks like:

<?xml version="1.0" encoding="UTF-8"?>
<ROOT number="34">
  <EVENTLIST>
    <EVENT name="hey"/>
    <EVENT name="you"/>
  </EVENTLIST>
</ROOT>

What I want is two documents like this:

{
  "number": "34",
  "name": "hey"
}
{
  "number": "34",
  "name": "you"
}
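In other words, just to pin down the transformation I'm after (a plain-Ruby sketch using the stdlib REXML parser; I'd like Logstash to produce the equivalent):

```ruby
require 'rexml/document'
require 'json'

xml = <<~XML
  <?xml version="1.0" encoding="UTF-8"?>
  <ROOT number="34">
    <EVENTLIST>
      <EVENT name="hey"/>
      <EVENT name="you"/>
    </EVENTLIST>
  </ROOT>
XML

doc = REXML::Document.new(xml)
number = doc.root.attributes["number"]          # keep the ROOT attribute
docs = REXML::XPath.match(doc, "//EVENT").map do |ev|
  { "number" => number, "name" => ev.attributes["name"] }
end
docs.each { |d| puts d.to_json }
```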

Logstash conf:

input {
  stdin { }
}
filter {
  xml {
    store_xml => "false"
    source => "message"
    target => "EVENT"
    xpath => [
      "/ROOT/@number","number",
      "/ROOT/EVENTLIST/EVENT/@name","name"
    ]
  }
}
output { elasticsearch { host => "localhost" } stdout { codec => rubydebug } }

This didn't work. What I get:

{
  "number": ["34"],
  "name": ["hey", "you"]
}

I followed the solution from this post: https://serverfault.com/questions/615196/logstash-parsing-xml-document-containing-multiple-log-entries

But my problem remains: I lose the information from the root tag.

One solution could be to use a ruby filter to handle this, but I don't know Ruby. Another would be some Java program that converts the XML into JSON before sending it to Elasticsearch...

Any ideas how to handle this, or do I have to learn Ruby?

asked Oct 14 '14 by jnaour

2 Answers

Try this filter:

filter {
  xml {
    source => "message"
    target => "xml_content"
  }
  split {
    field => "[xml_content][EVENTLIST]"
  }
  split {
    field => "[xml_content][EVENTLIST][EVENT]"
  }
  mutate {
    add_field => { "number" => "%{[xml_content][number]}" }
    add_field => { "name" => "%{[xml_content][EVENTLIST][EVENT][name]}" }
    remove_field => ['xml_content', 'message', 'path']
  }
}
output {
  stdout {
    codec => rubydebug
  }
}

It returns these events:

{
        "number" => "34",
    "@timestamp" => 2016-12-23T12:01:17.888Z,
      "@version" => "1",
          "host" => "xubuntu",
          "name" => "hey"
}
}
{
        "number" => "34",
    "@timestamp" => 2016-12-23T12:01:17.888Z,
      "@version" => "1",
          "host" => "xubuntu",
          "name" => "you"
}
}
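To see what those two `split` filters are doing, here is a standalone Ruby sketch of the same idea (illustrative only, not the plugin's actual code): each `split` fans one event out into several, one per element of a nested array field.

```ruby
# Fan one event out into several, once per element of the array at `path`
# (a deep-copied event per element) -- the same idea as logstash's split filter.
def split_on(events, path)
  events.flat_map do |event|
    items = path.reduce(event) { |h, key| h[key] }        # walk down to the array
    items.map do |item|
      copy = Marshal.load(Marshal.dump(event))            # deep-copy the event
      parent = path[0..-2].reduce(copy) { |h, key| h[key] }
      parent[path[-1]] = item                             # replace array with one element
      copy
    end
  end
end

# Roughly what the xml filter stores under "xml_content" for the sample input:
event = { "xml_content" => {
  "number"    => "34",
  "EVENTLIST" => [{ "EVENT" => [{ "name" => "hey" }, { "name" => "you" }] }],
} }

events = split_on([event], ["xml_content", "EVENTLIST"])
events = split_on(events, ["xml_content", "EVENTLIST", "EVENT"])
events.each do |e|
  puts({ "number" => e["xml_content"]["number"],
         "name"   => e["xml_content"]["EVENTLIST"]["EVENT"]["name"] })
end
```

After the second split there are two events, each carrying a single `EVENT` hash plus the untouched `number`, which is exactly what the `mutate` then copies into top-level fields.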
answered Nov 15 '22 by drinor


If your structure is as simple as you show, you can use a memorize plugin that I wrote.

Your configuration would look something like this:

filter {
  if ([message] =~ /<ROOT/) {
    grok {
      match => [ "message", 
        'number="(?<number>\d+)" number2="(?<number1>\d+)"'
      ] 
    }
  } else if ([message] =~ /<EVENT /) {
    grok { 
      match => [ "message", 'name="(?<name>[^"]+)"']
    }
  }
  memorize {
    fields => ["number","number1"]
  }
  if ([message] !~ /<EVENT /) {
    drop {}
  } else {
    mutate { remove_field => ["message"] }
  }
}
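The flow of that config can be sketched in plain Ruby (illustrative only, not the plugin code): the ROOT line is grokked and its fields stored, then dropped; each EVENT line then has the memorized fields copied in.

```ruby
# Sketch of the grok + memorize + drop flow above (illustrative, not the plugin).
memory = {}        # plays the role of memorize's per-stream state
documents = []

lines = [
  '<ROOT number="34">',
  '<EVENT name="hey"/>',
  '<EVENT name="you"/>',
]

lines.each do |line|
  if line =~ /<ROOT/
    memory["number"] = line[/number="(\d+)"/, 1]  # grok extracts, memorize stores
    # the ROOT event itself is dropped
  elsif line =~ /<EVENT /
    documents << {                                # memorize injects the stored field
      "number" => memory["number"],
      "name"   => line[/name="([^"]+)"/, 1],
    }
  end
end

documents.each { |d| puts d }
```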

My example shows looking for multiple things in the ROOT element based on your comments below. And here's the version of the plugin that supports memorizing multiple fields:

# encoding: utf-8
require "logstash/filters/base"
require "logstash/namespace"
require "set"
#
# This filter will look for fields from an event and record the last value
# of them.  If any are not present, their last value will be added to the
# event
#
# The config looks like this:
#
#     filter {
#       memorize {
#         fields => ["time"]
#         default => { "time" => "00:00:00.000" }
#       }
#     }
#
# The `fields` is an array of the field NAMES that you want to memorize
# The `default` is a map of field names to field values that you want
# to use if the field isn't present and has no memorized value (optional)

class LogStash::Filters::Memorize < LogStash::Filters::Base

  config_name "memorize"
  milestone 2

  # An array of the field names to to memorize
  config :fields, :validate => :array, :required => true
  # a map for default values to use if its not seen before we need it
  config :default, :validate => :hash, :required => false

  # The stream identity is how the filter determines which stream an
  # event belongs to. See the multiline plugin if you want more details on how
  # this might work
  config :stream_identity , :validate => :string, :default => "%{host}.%{path}.%{type}"

  public
  def initialize(config = {})
    super

    @threadsafe = false

    # This filter needs to keep state.
    @memorized = Hash.new
  end # def initialize

  public
  def register
    # nothing needed
  end # def register

  public
  def filter(event)
    return unless filter?(event)

    any = false
    @fields.each do |field|
      if event[field].nil?
        # field missing on this event -- fill it in from memory (or the default)
        map = @memorized[@stream_identity]
        val = map.nil? ? nil : map[field]
        if val.nil?
          val = @default.nil? ? nil : @default[field]
        end
        if !val.nil?
          event[field] = val
          any = true
        end
      else
        # field present -- remember it for later events in this stream
        map = @memorized[@stream_identity]
        if map.nil?
          map = @memorized[@stream_identity] = Hash.new
        end
        map[field] = event[field]
      end # if
      if any
        filter_matched(event)
      end
    end # @fields.each
  end
end

For Logstash 1.5 and later, this plugin can be installed via:

bin/plugin install logstash-filter-memorize
answered Nov 15 '22 by Alcanzar