I am trying to load a CSV file in Logstash, but it is not reading the file and not creating the index in Elasticsearch. I need to get the CSV data into Elasticsearch. I have tried a few changes in the config file.
My config file:
input {
  file {
    type => "csv"
    path => "/root/installables/*.csv"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost"]
    index => "client"
  }
}
Could anybody tell me how to load a CSV file in Logstash?
You need to set sincedb_path in the file input if you want to reread the file on every run. On Linux, use sincedb_path => "/dev/null"; on Windows, the equivalent null device is "NUL", not "NULL" (setting it to "NULL" just creates an ordinary file with that name, so the read position is still persisted).
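As a sketch, reusing the path from the question, the file input would look like this:

input {
  file {
    path => "/root/installables/*.csv"
    start_position => "beginning"
    # Discard the sincedb so the file is reread on every run:
    # "/dev/null" on Linux, "NUL" on Windows.
    sincedb_path => "/dev/null"
  }
}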
I think you should use a csv filter instead of grok. I made it work like this:
input {
  file {
    path => "/filepath..."
    # read from the beginning of the file
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    columns => ["COL1", "COL2"]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost"]
    index => "csv_index"
  }
}
Also, adding stdout as an output helps you debug and confirm whether the file is actually being loaded.
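For instance, assuming the config above is saved as csv.conf (an illustrative file name), you can run it with:

bin/logstash -f csv.conf

With the columns defined above, an input line such as abc,123 should then appear in the rubydebug output roughly as follows (metadata fields such as @timestamp, host, and path omitted here):

{
    "message" => "abc,123",
       "COL1" => "abc",
       "COL2" => "123"
}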