Let's say you have 2 very different types of logs, such as technical and business logs, and you want:
- the technical logs routed to a gelf output,
- the business logs routed to an elasticsearch_http output.
I know that with Syslog-NG, for instance, the configuration file allows defining several distinct inputs which can then be processed separately before being dispatched; something Logstash seems unable to do. Even if one instance can be started with two specific configuration files, all logs take the same channel and have the same processing applied to them ...
Should I run as many instances as I have different types of logs?
Your Logstash pipeline can use multiple input and output plugins to handle these requirements. In this section, you create a Logstash pipeline that takes input from a Twitter feed and the Filebeat client, then sends the information to an Elasticsearch cluster as well as writing the information directly to a file.
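A rough sketch of that pipeline could look like the following; the Twitter credentials, Beats port, and output paths are illustrative placeholders, not values from the original:
input {
  twitter {
    # placeholder credentials for the Twitter API
    consumer_key       => "<consumer_key>"
    consumer_secret    => "<consumer_secret>"
    oauth_token        => "<oauth_token>"
    oauth_token_secret => "<oauth_token_secret>"
    keywords           => ["logstash"]
  }
  beats {
    port => 5044    # Filebeat ships log lines to this port
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]           # assumed local Elasticsearch cluster
  }
  file {
    path => "/tmp/logstash-output.log"    # also write each event to a file
  }
}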
You either want something similar to what @ITIC suggested, or you simply want to run the Logstash instance once and have all your conf files be run. Then simply run Logstash without any additional option (like bin/logstash from the Logstash directory). It'll run all the pipelines specified in pipelines.yml.
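For reference, a minimal pipelines.yml along those lines; the pipeline ids and config paths below are made up for illustration:
# config/pipelines.yml -- each entry runs as an isolated pipeline
- pipeline.id: technical
  path.config: "/etc/logstash/conf.d/technical.conf"
- pipeline.id: business
  path.config: "/etc/logstash/conf.d/business.conf"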
Logstash is an open source data processing pipeline that ingests events from one or more inputs, transforms them, and then sends each event to one or more outputs. Some Logstash implementations may have many lines of code and may process events from multiple input sources.
Yes, both Filebeat and Logstash can be used to send logs from a file-based data source to a supported output destination.
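On the Filebeat side, a bare-bones filebeat.yml might look like this; the path and host are assumptions for illustration:
filebeat.inputs:
  - type: log
    paths:
      - /home/technical/log    # the file-based data source
output.logstash:
  hosts: ["localhost:5044"]    # forward to Logstash's beats input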
Should I run as many instances as I have different types of logs?
No! You only need to run one instance to handle the different types of logs.
In the Logstash configuration file, you can give each input a different type. Then in the filter you can use if conditionals to apply distinct processing, and likewise in the output you can use if to route events to different destinations.
input {
  file {
    type => "technical"
    path => "/home/technical/log"
  }
  file {
    type => "business"
    path => "/home/business/log"
  }
}
filter {
  if [type] == "technical" {
    # processing .......
  }
  if [type] == "business" {
    # processing .......
  }
}
output {
  if [type] == "technical" {
    # output to gelf
  }
  if [type] == "business" {
    # output to elasticsearch
  }
}
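To make the routing concrete, the two placeholder comments could be filled in roughly as follows; the Graylog host and Elasticsearch address are assumptions, not values from the question:
output {
  if [type] == "technical" {
    gelf {
      host => "graylog.example.com"    # assumed Graylog server
      port => 12201                    # default GELF port
    }
  }
  if [type] == "business" {
    elasticsearch {
      hosts => ["localhost:9200"]      # assumed Elasticsearch node
      index => "business-%{+YYYY.MM.dd}"
    }
  }
}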
Hope this can help you :)
I used tags for multiple file inputs:
input {
  file {
    type => "java"
    path => "/usr/aaa/logs/stdout.log"
    codec => multiline {
      ...
    }
    tags => ["aaa"]
  }
  file {
    type => "java"
    path => "/usr/bbb/logs/stdout.log"
    codec => multiline {
      ...
    }
    tags => ["bbb"]
  }
}
output {
  stdout {
    codec => rubydebug    # print every event to the console for debugging
  }
  if "aaa" in [tags] {
    elasticsearch {
      hosts => ["192.168.100.211:9200"]
      index => "aaa"
      # note: document_type is deprecated and ignored on Elasticsearch 7+
      document_type => "aaa-%{+YYYY.MM.dd}"
    }
  }
  if "bbb" in [tags] {
    elasticsearch {
      hosts => ["192.168.100.211:9200"]
      index => "bbb"
      document_type => "bbb-%{+YYYY.MM.dd}"
    }
  }
}