Creating a geopoint object from CSV (latitude and longitude columns) in Logstash

I have a CSV with the columns latitude and longitude and I'm trying to create a geopoint object in Logstash 2.3.3 so that I can visualize these values in Kibana 4.5.1.

However when visualizing the data in Kibana, I see location.lat and location.lon, both of type float and not a location of type geopoint.

I'm new to the ELK stack in general, and this is driving me crazy, especially because most of the information I'm finding is outdated.

The .conf file that I'm using looks like this:

input {  
    file {
        path => "C:/file.csv"
        start_position => "beginning"    
    }
}

filter {  
    csv {
        separator => ","
        columns => ["longitude","latitude"]
    }

    mutate { convert => {"latitude" => "float"} }
    mutate { convert => {"longitude" => "float"} }
    mutate { rename => {"latitude" => "[location][lat]"} }
    mutate { rename => {"longitude" => "[location][lon]"} }
    mutate { convert => { "[location]" => "float" } }
}

output {  
    elasticsearch {
        template => "...\elasticsearch-template.json"
        template_overwrite => true
        action => "index"
        hosts => "localhost"
        index => "testindex1"
        workers => 1
    }
    stdout {}
}

The template file I'm specifying (elasticsearch-template.json) is the following:

{
  "template" : "logstash-*",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : {"enabled" : true, "omit_norms" : true},
      "dynamic_templates" : [ {
        "message_field" : {
          "match" : "message",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fielddata" : { "format" : "disabled" }
          }
        }
      }, {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fielddata" : { "format" : "disabled" },
            "fields" : {
              "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 256}
            }
          }
        }
      } ],
      "properties" : {
        "@timestamp": { "type": "date" },
        "@version": { "type": "string", "index": "not_analyzed" },
        "geoip"  : {
          "dynamic": true,
          "properties" : {
            "ip": { "type": "ip" },
            "location" : { "type" : "geo_point" },
            "latitude" : { "type" : "float" },
            "longitude" : { "type" : "float" }
          }
        },
  "location" : { "type": "geo_point" }
      }
    }
  }
}

If anyone could help me or give me some insight as to what I'm doing wrong, I would be very grateful. Also I'm sure this will help everyone who is on the same boat as me.

I solved it and it is now working perfectly. The template only applies to indices matching the pattern logstash-*, and I was using testindex1, so the geo_point mapping was never picked up. Changing my index to logstash-%{+dd.MM.YYYY} fixed it.
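For reference, this is what the corrected output section would look like (same settings as in the question; only the index name changes so that it matches the template's "logstash-*" pattern):

    output {  
        elasticsearch {
            template => "...\elasticsearch-template.json"
            template_overwrite => true
            action => "index"
            hosts => "localhost"
            # the index name now matches the "logstash-*" pattern declared
            # in elasticsearch-template.json, so the geo_point mapping applies
            index => "logstash-%{+dd.MM.YYYY}"
            workers => 1
        }
        stdout {}
    }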

naranjja asked Jun 23 '16 22:06

1 Answer

You need to remove the last mutate filter (the one converting the [location] field to float), which defeats the purpose of what you're trying to achieve.
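Concretely, the filter section would then look like this (same columns as in the question, with only the final convert on [location] dropped):

    filter {  
        csv {
            separator => ","
            columns => ["longitude","latitude"]
        }

        # convert the raw CSV string values to floats first...
        mutate { convert => {"latitude" => "float"} }
        mutate { convert => {"longitude" => "float"} }
        # ...then nest them under location, which the template maps as geo_point
        mutate { rename => {"latitude" => "[location][lat]"} }
        mutate { rename => {"longitude" => "[location][lon]"} }
    }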

Also, you need to make sure that the testindex1 mapping faithfully matches the mapping you have in your elasticsearch-template.json file.
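One way to check the mapping actually applied to the index (assuming a default local node on port 9200, as suggested by hosts => "localhost" in the question):

    curl -XGET 'http://localhost:9200/testindex1/_mapping?pretty'

The location field should come back with "type": "geo_point"; if it instead shows location.lat and location.lon as float, the template was not applied to that index.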

Val answered Mar 16 '23 22:03