After processing data with: input | filter | output > ElasticSearch, the format it gets stored in is something like:
"_index": "logstash-2012.07.02",
"_type": "stdin",
"_id": "JdRaI5R6RT2do_WhCYM-qg",
"_score": 0.30685282,
"_source": {
"@source": "stdin://dist/",
"@type": "stdin",
"@tags": [
"tag1",
"tag2"
],
"@fields": {},
"@timestamp": "2012-07-02T06:17:48.533000Z",
"@source_host": "dist",
"@source_path": "/",
"@message": "test"
}
I filter/store most of the important information in specific fields. Is it possible to leave out default fields like @source_path and @source_host? In the near future this setup is going to store 8 billion logs/month, and I would like to run some performance tests with these default fields excluded (I just don't use them).
This removes the fields from the output:
filter {
  mutate {
    # remove duplicate fields
    # this leaves timestamp from message and source_path for source
    remove => ["@timestamp", "@source"]
  }
}
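For the two fields asked about, a similar mutate block should work. This is only a minimal sketch using the same old-style mutate syntax as above; newer Logstash releases use the remove_field option instead of remove:

filter {
  mutate {
    # drop the unused default fields before the event reaches ElasticSearch
    remove => ["@source_host", "@source_path"]
  }
}

On more recent Logstash versions the equivalent would be remove_field => ["@source_host", "@source_path"] inside the same mutate filter.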