Attaching a TTL field with every log sent via logstash to Elasticsearch

Summary: I want to attach a TTL field to the logs in Logstash and send them over to Elasticsearch.

I have already gone through the documentation but could not get much out of it, since it is not very clear.

This is my config file in logstash.

input {
  stdin {
    type => "stdin-type"
  }
}

output {
  # Print each event to the console as JSON (Logstash 1.1.x-era options)
  stdout { debug => true debug_format => "json" }
  # Ship events to a local Elasticsearch instance with default settings
  elasticsearch {}
}

Now suppose that for each log that is read, I want to attach a TTL of, say, 5 days.

I know how to activate the TTL option in Elasticsearch, but it is not clear to me what changes I have to make in the Elasticsearch configuration files. The documentation says to look for a mappings folder, but there is none in the Elasticsearch download folder.

Looking for expert help.

Asked May 10 '13 by user2359303

People also ask

How do I transfer data from Logstash to Elasticsearch?

To use this configuration, we must also set up Logstash to receive events from Beats. In this setup, the Beat sends events to Logstash. Logstash receives these events by using the Beats input plugin for Logstash and then sends the transaction to Elasticsearch by using the Elasticsearch output plugin for Logstash.
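As a concrete illustration, a minimal Logstash configuration for such a setup might look like the following sketch. The port 5044 is the conventional Beats port, and the localhost Elasticsearch address is an assumption for a single-machine setup:

```
input {
  # Listen for events shipped by a Beat (Filebeat, Metricbeat, ...)
  beats {
    port => 5044
  }
}

output {
  # Forward every received event to Elasticsearch
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}
```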

Can Logstash have multiple outputs?

Yes. Logstash can forward filtered data either to a single output destination or to multiple outputs, routing each stream of input events to its particular output by filtering the events appropriately.
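For example, events can be routed to different outputs with conditionals in the output section. This is a sketch that assumes a type field has been set on the inputs:

```
output {
  # Send Apache events to Elasticsearch...
  if [type] == "apache" {
    elasticsearch { hosts => ["http://localhost:9200"] }
  } else {
    # ...and print everything else to the console
    stdout { }
  }
}
```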

How do I send a log file to Elasticsearch?

You need to install Filebeat first, which collects logs from all the web servers. After that you need to pass the logs from Filebeat -> Logstash. In Logstash you can format and drop unwanted logs based on grok patterns. Finally, forward the logs from Logstash -> Elasticsearch for storing and indexing.
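The steps above can be sketched as a single Logstash pipeline. The grok pattern and the drop condition here are placeholders to be replaced with your own rules (COMBINEDAPACHELOG is a stock pattern shipped with the grok filter):

```
input {
  # Receive events from Filebeat
  beats { port => 5044 }
}

filter {
  # Parse each log line with a grok pattern
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Drop events that did not match the pattern
  if "_grokparsefailure" in [tags] {
    drop { }
  }
}

output {
  # Forward the remaining, structured events for storing and indexing
  elasticsearch { hosts => ["http://localhost:9200"] }
}
```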

Can Logstash forward logs?

Yes, both Filebeat and Logstash can be used to send logs from a file-based data source to a supported output destination.


1 Answer

Have a look here if you want to put the mapping on the file system. You have to go to the config folder and create a folder there called mappings, and within mappings another folder with the name of the index. Since Logstash creates an index per day by default, you'd better name the folder _default, so that the mapping is applied to all indexes. The file that you create under that folder must have the name of the type you want to apply the mapping to. I don't remember exactly what type Logstash uses, so I would use the _default_ mapping definition. Just call the file _default_.json and put the following content in it:

{
    "_default_" : {
        "_ttl" : { "enabled" : true }
    }
}

As you can see, the name of the type must appear both in the filename and in its content.
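Put together, the file placement looks like this. The path to the Elasticsearch config directory is an assumption here (an archive unpacked under ./elasticsearch); adjust ES_CONF to your own install:

```shell
# Location of the Elasticsearch config folder -- adjust to your install
# (assumption: the download was unpacked under ./elasticsearch)
ES_CONF=${ES_CONF:-./elasticsearch/config}

# Create the mappings folder and the per-index folder inside it;
# _default applies the mapping to every index
mkdir -p "$ES_CONF/mappings/_default"

# Write the _default_ mapping that enables TTL for all types
cat > "$ES_CONF/mappings/_default/_default_.json" <<'EOF'
{
    "_default_" : {
        "_ttl" : { "enabled" : true }
    }
}
EOF
```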

Otherwise, you could avoid putting stuff on file system. You could create an index template containing your custom mapping, like the following:

{
    "template" : "logstash-*",
    "mappings" : {
        "_default_" : {
            "_ttl" : { "enabled" : true }
        }
    }
}

The mapping will then be applied to all the indices whose names match the template pattern. If you use the _default_ mapping definition, the mapping will be applied by default to all the types that are going to be created.
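The template can be registered through the index templates REST API; a sketch assuming Elasticsearch is listening on localhost:9200 (the template name logstash_ttl is arbitrary):

```
curl -XPUT 'http://localhost:9200/_template/logstash_ttl' -d '{
    "template" : "logstash-*",
    "mappings" : {
        "_default_" : {
            "_ttl" : { "enabled" : true }
        }
    }
}'
```

Unlike the file-system approach, this requires no restart: the template is stored in the cluster and applied to every new logstash-* index as it is created.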

Answered Jan 01 '23 by javanna