I have one machine running Logstash and another Elasticsearch/Kibana machine that stores the logs shipped from the first. Naturally, I want no logs kept on the origin machine; logging should be handled only on the Elasticsearch cluster.
Unfortunately, Logstash creates huge log files on the origin machine, where nothing should be kept.
I have only one file under /etc/logstash on that machine, and as far as I can see, the configuration does not specify a local output:
input {
  tcp {
    port => 5959
    codec => json
  }
  udp {
    port => 5959
  }
}
filter {
  json {
    source => "message"
  }
}
filter {
  if [@message] == "Incoming Event" {
    mutate {
      add_field => {
        "location" => "%{@fields[location]}"
      }
    }
  }
}
output {
  elasticsearch {
    # The host on which Elasticsearch and Kibana live
    host => "some.internal.aws.ip"
  }
}
How can I stop Logstash from writing local logs through configuration? I know I could delete them with a cron job, but prevention seems less error-prone.
I had the same problem running on a CentOS 7 machine: no output configured other than elasticsearch, yet Logstash still wrote every incoming message to logstash.log and logstash.stdout.
After digging into the actual Ruby code, it turned out that the default logging mode is very verbose.
There is, however, a flag called --quiet (apparently undocumented, as far as I can see) that solves the problem.
Add the flag to the LS_OPTS variable, either in the config file (/etc/sysconfig/logstash on CentOS) or directly in the init.d script, like so:
# Arguments to pass to logstash agent
LS_OPTS="--quiet"
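If you want to confirm the flag works before touching the init script, you can also pass it on a manual run. The paths below are assumptions based on a typical RPM install of Logstash 1.x/2.x (binary under /opt/logstash, configs under /etc/logstash/conf.d), so adjust them to your setup:
# Hypothetical manual run to check that --quiet is accepted and suppresses the per-event logging
/opt/logstash/bin/logstash agent -f /etc/logstash/conf.d/ --quiet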
This output is likely caused by having the following in one of your output config files:
stdout { codec => rubydebug }
After removing that from my 30-output.conf, Logstash stopped being so verbose.
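For reference, a trimmed output section with the stdout line removed, shipping events only to Elasticsearch, might look like the sketch below (the filename 30-output.conf comes from above; the host is the placeholder from the question):
# 30-output.conf -- only Elasticsearch, nothing echoed to the local console or log file
output {
  elasticsearch {
    host => "some.internal.aws.ip"
  }
}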
This thread led me to the answer.