I am trying to read CSV data in Logstash, but somehow Logstash isn't splitting the strings as CSV.
Logstash config:
input {
  file {
    path => [ "/root/logstash/temp.csv" ]
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ['A','B','C','D','E']
  }
}
output {
  stdout { }
}
Test CSV file:
p,q,r,s,t
p,q,r,s,t
p,q,r,s,t
p,q,r,s,t
p,q,r,s,t
p,q,r,s,t
Output of Logstash:
2014-04-23T13:26:53.415+0000 0.0.0.0 p,q,r,s,t
2014-04-23T13:26:53.416+0000 0.0.0.0 p,q,r,s,t
2014-04-23T13:26:53.416+0000 0.0.0.0 p,q,r,s,t
2014-04-23T13:26:53.417+0000 0.0.0.0 p,q,r,s,t
2014-04-23T13:26:53.417+0000 0.0.0.0 p,q,r,s,t
2014-04-23T13:26:53.418+0000 0.0.0.0 p,q,r,s,t
Can somebody help me with this issue? Things I have already tried:
1) Replacing the single quotes with double quotes in columns
2) Using different data
I am expecting columnar output like that shown in this link: https://blog.trifork.com/2014/01/28/using-logstash-elasticsearch-and-kibana-to-monitor-your-video-card-a-tutorial/
The CSV is being parsed; it just isn't visible with the default stdout output. You need to specify a codec in the output.
For example, with your configuration:
input {
  file {
    path => [ "/root/logstash/temp.csv" ]
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ['A','B','C','D','E']
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
With the rubydebug codec added, stdout prints the full event, including the parsed fields:
{
"message" => [
[0] "p,q,r,s,t"
],
"@version" => "1",
"@timestamp" => "2014-04-24T02:57:37.099Z",
"A" => "p",
"B" => "q",
"C" => "r",
"D" => "s",
"E" => "t"
}
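Conceptually, the csv filter is doing what Ruby's standard CSV library does: it splits the message field on the separator and zips the values with the configured column names to produce the A–E fields shown above. A minimal sketch of that behavior (not Logstash's actual implementation):

```ruby
require 'csv'

# Column names, as configured in the filter's `columns` option
columns = ['A', 'B', 'C', 'D', 'E']

# A single event's message, one line from temp.csv
line = 'p,q,r,s,t'

# Split the line into fields using the default "," separator
fields = CSV.parse_line(line)   # ["p", "q", "r", "s", "t"]

# Pair each column name with its value, producing the event fields
event = columns.zip(fields).to_h
puts event
```

The rubydebug codec simply pretty-prints this event hash (plus @timestamp, @version, and the original message), which is why the parsed fields only become visible once you set it.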