Logstash/ElasticSearch: guesses wrong for datatype for field

The log files I'm trying to import into Logstash contain a field that sometimes looks like a date/time and sometimes does not. Unfortunately, the first occurrence looked like a date/time, and something (Logstash or Elasticsearch) decided to define the field as a date/time. When a later log record is imported, Elasticsearch throws an exception:

Failed to execute [index ...]  
org.elasticsearch.index.mapper.MapperParsingException: Failed to parse [@fields.field99]  
at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:320)  
at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:587)  
...  
Caused by: java.lang.IllegalArgumentException: Invalid format: "empty"  
at org.elasticsearch.common.joda.time.format.DateTimeFormatter.parseMillis(DateTimeFormatter.java:747)  
...  

Question: How do I tell Logstash/Elasticsearch not to define this field as a date/time? I would like all fields from my log (except the one explicit timestamp field) to be mapped as plain text.

Question: It appears that Logstash stops importing records from the log file after hitting the one that makes Elasticsearch throw this exception. How can I tell Logstash to skip the offending record and keep importing the rest of the file?

allen asked Apr 01 '13 19:04

1 Answer

I found the answer to my first question myself.

Before adding any data through Logstash, I had to give Elasticsearch a default mapping so that it treats the field as "string" instead of "date".

I did this by creating a defaults_for_elasticsearch.js file like this:

{
    "template": "logstash-*",
    "mappings": {
        "_default_": {
            "dynamic_templates": [{
                "fields_template": {
                    "mapping": { "type": "string" },
                    "path_match": "@fields.*"
                }
            }]
        }
    }
}

and telling Elasticsearch to use it before adding any data through Logstash:

curl -XPUT 'http://localhost:9200/_template/template_logstash/' -d @defaults_for_elasticsearch.js
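As an offline sanity check (a sketch of mine, not part of the original answer; it needs only the Python standard library and no running Elasticsearch), the template body can be parsed and the dynamic-template rule verified before PUTting it:

```python
import json

# The index-template body from defaults_for_elasticsearch.js, inlined here
# so the check runs without reading the file.
template_json = """
{
    "template": "logstash-*",
    "mappings": {
        "_default_": {
            "dynamic_templates": [{
                "fields_template": {
                    "mapping": { "type": "string" },
                    "path_match": "@fields.*"
                }
            }]
        }
    }
}
"""

template = json.loads(template_json)  # raises ValueError if the JSON is malformed

# Confirm the rule that forces every @fields.* field to be mapped as "string".
rule = template["mappings"]["_default_"]["dynamic_templates"][0]["fields_template"]
assert rule["path_match"] == "@fields.*"
assert rule["mapping"]["type"] == "string"
print("template OK:", template["template"])  # prints: template OK: logstash-*
```

This catches the easy mistakes (a stray comma or quote in the file) before the template ever reaches the cluster.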

Hope this helps someone else.

allen answered Sep 29 '22 06:09