Logstash agent not indexing anymore

I have a Logstash instance running as a service that reads from Redis and outputs to Elasticsearch. I just noticed there was nothing new in Elasticsearch for the last few days, but the Redis lists were increasing.

The Logstash log was filled with two errors repeated over thousands of lines:

:message=>"Got error to send bulk of actions"
:message=>"Failed to flush outgoing items"

The reason being:

{"error":"IllegalArgumentException[Malformed action/metadata line [107], expected a simple value for field [_type] but found [START_ARRAY]]","status":500}, 

Additionally, trying to stop the service failed repeatedly; I had to kill it. Restarting it emptied the Redis lists and imported everything into Elasticsearch. It seems to work fine now.

But I have no idea how to prevent this from happening again. The type field mentioned in the error is set to a string in each input directive, so I don't understand how it could have become an array.
What am I missing?

I'm using Elasticsearch 1.7.1 and Logstash 1.5.3. The logstash.conf file looks like this:

input {
  redis {
    host => "127.0.0.1"
    port => 6381
    data_type => "list"
    key => "b2c-web"
    type => "b2c-web"
    codec => "json"
  }
  redis {
    host => "127.0.0.1"
    port => 6381
    data_type => "list"
    key => "b2c-web-staging"
    type => "b2c-web-staging"
    codec => "json"
  }

  # other redis inputs, only key/type variations
}
filter {
  grok {
    match => ["msg", "Cache hit %{WORD:query} in %{NUMBER:hit_total:int}ms. Network: %{NUMBER:hit_network:int} ms.     Deserialization %{NUMBER:hit_deserial:int}"]
    add_tag => ["cache_hit"]
    tag_on_failure => []
  }
  # other grok filters, not related to the type field
}
output {
  elasticsearch {
    host => "[IP]"
    port => "9200"
    protocol=> "http"
    cluster => "logstash-prod-2"
  }
}
asked Sep 02 '15 by Antoine

1 Answer

According to your log message:

{"error":"IllegalArgumentException[Malformed action/metadata line [107], expected a simple value for field [_type] but found [START_ARRAY]]","status":500},

It seems you're sending Elasticsearch a bulk request in which the _type metadata field is an array instead of a simple string.

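For context, every document in an Elasticsearch bulk request is preceded by an action/metadata line, and _type there has to be a single value. A purely illustrative comparison (the index name and values are made up):

    # accepted: _type is a simple string
    {"index":{"_index":"logstash-2015.09.02","_type":"b2c-web"}}
    # rejected with the START_ARRAY error above: _type is an array
    {"index":{"_index":"logstash-2015.09.02","_type":["b2c-web","b2c-web-staging"]}}
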
I can't say much more without seeing the rest of the logstash.conf file, but check the following to make sure:

  1. If you use add_field anywhere to change type, Logstash appends the new value rather than overwriting the existing one, so type becomes an array with multiple values, which is exactly what Elasticsearch is complaining about (see the sketch after this list).

  2. You can use the mutate filter's join option to convert the array back into a single string (see the join option in the mutate filter documentation):

    filter {
        mutate {
            join => { "fieldname" => "," }
        }
    }
    
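To make point 1 concrete, here is a hypothetical sketch (the "some-new-type" value is invented for illustration, not taken from your config): if any filter sets type with add_field, the value is appended to the type already set by the redis input and the field becomes an array, whereas mutate's replace option overwrites it and keeps it a plain string.

    filter {
        mutate {
            # add_field on a field that already exists appends to it,
            # so "type" becomes ["b2c-web", "some-new-type"] -> START_ARRAY error
            add_field => { "type" => "some-new-type" }
        }

        mutate {
            # replace overwrites the existing value, so "type" stays a simple string
            replace => { "type" => "some-new-type" }
        }
    }

If none of your filters touch type with add_field, the join workaround above at least keeps the bulk requests valid while you track down where the array comes from.
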
answered Nov 16 '22 by Dulguun