I am having some issues with my ELK stack. The data flow is as follows:
Filebeat -> Logstash -> Elasticsearch -> Kibana
Parts of our logs don't arrive in Elasticsearch from specific machines. I suspect the problem is in the log harvesting in Filebeat.
I tried to find information on debugging in the Elastic documentation and on GitHub, but I only found these links, which say:
By default, Filebeat sends all its output to syslog. When you run Filebeat in the foreground, you can use the -e command line flag to redirect the output to standard error instead. For example:
filebeat -e
The default configuration file is filebeat.yml (the location of the file varies by platform). You can use a different configuration file by specifying the -c flag. For example:
filebeat -e -c myfilebeatconfig.yml
You can increase the verbosity of debug messages by enabling one or more debug selectors. For example, to view the published transactions, you can start Filebeat with the publish selector like this:
filebeat -e -d "publish"
If you want all the debugging output (fair warning, it’s quite a lot), you can use *, like this:
filebeat -e -d "*"
filebeat -e
doesn't show me what I need, and the other options produce far too much output. Are there any other methods to debug our ELK stack with this specific architecture, or any other command-line options?
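For reference, the same docs suggest that debug output can be limited to specific selectors rather than all of them, either on the command line (e.g. -d "harvester,publish") or persistently in filebeat.yml. A sketch, assuming harvester and publish are valid selector names for this Filebeat version:
logging.level: debug
# Limit debug output to the named components instead of everything ("*").
logging.selectors: ["harvester", "publish"]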
FYI: I already tried running Filebeat as a service on my own machine, where it performs well with the same filebeat.yml configuration.
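For context, the filebeat.yml in question is along these lines (a minimal sketch; the log path and Logstash host are placeholders, not our real values, and older Filebeat versions use filebeat.prospectors instead of filebeat.inputs):
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/myapp/*.log    # placeholder path
output.logstash:
  hosts: ["logstash-host:5044"] # placeholder host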
# The # character at the beginning of a line indicates a comment. Use
# comments to describe your configuration.
input {
}
# The filter part of this file is commented out to indicate that it is
# optional.
# filter {
#
# }
output {
}
Do you have such a configuration as part of your Logstash config? If so, any problem with inputs (Filebeat in your case) will show up in the Logstash logs. I used a grok filter on my Filebeat logs, so the Logstash logs also told me when log lines couldn't be parsed.
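As an illustration of that approach, here is a sketch of such a pipeline with a beats input and a grok filter (the port, pattern, and Elasticsearch host are placeholders; lines that grok cannot match are tagged _grokparsefailure, which is what makes parse failures easy to spot):
input {
  beats {
    port => 5044   # placeholder port; must match the Filebeat output
  }
}
filter {
  grok {
    # Placeholder pattern; replace with one that matches your log format.
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]   # placeholder host
  }
}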