I am parsing proxy logs with Logstash and its Grok filter. The logs contain quoted strings:
1438120705 [.....] "SEF-EDP8" - "C"
"/GPM/1023/5745-7/456V/"
With the Grok Debugger, the following pattern works like a charm:
%{NUMBER:ts} [......] (-|"%{USERNAME:token1}") (-|%{DATA:token2}) (-|"%{WORD:token3}") (-|"%{DATA:token4}")
This does not work with Logstash's Grok filter because of the double quotes inside the pattern. Logstash error log:
Error: Expected one of #, {, } at line 9, column 204 (byte 374) after
filter {
grok {
match => { "message" => "%{NUMBER:ts} [......] ("
So I use the QS (quoted string) grok pattern instead:
%{NUMBER:ts} [......] (-|%{QS:token1}) (-|%{DATA:token2}) (-|%{QS:token3}) (-|%{QS:token4})
This works with the Grok Debugger as well, but the surrounding quotes are captured as part of the extracted values. And it still doesn't work with Logstash:
token1 : ""SEF-EDP8"" token2 : null token3 : ""C"" token4 :
""/GPM/1023/5745-7/456V/""
How can I make it work with Logstash? How can I remove these unwanted extra double quotes?
Changing the outer double quotes to single quotes did the trick for me:
grok {
match => { "message" => 'SOME "TEXT QUOTED"' }
}
Hope it helps.
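Applied to the pattern from the question, the whole filter can keep the inner double quotes unescaped (a sketch; field names are taken from the question, and `[......]` is the question's elided placeholder, left as-is):

filter {
  grok {
    # Single quotes around the pattern let the inner double quotes pass through unescaped
    match => { "message" => '%{NUMBER:ts} [......] (-|"%{USERNAME:token1}") (-|%{DATA:token2}) (-|"%{WORD:token3}") (-|"%{DATA:token4}")' }
  }
}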
If you escape each " with a backslash, it works fine.
%{NUMBER:ts} [......] (-|"%{USERNAME:token1}") (-|%{DATA:token2}) (-|"%{WORD:token3}") (-|"%{DATA:token4}")
Your new string will look like
%{NUMBER:ts} [......] (-|\"%{USERNAME:token1}\") (-|%{DATA:token2}) (-|\"%{WORD:token3}\") (-|\"%{DATA:token4}\")
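In a Logstash config this would look like the following (a sketch; `[......]` stands for the part of the pattern elided in the question):

filter {
  grok {
    # Every literal double quote in the pattern is escaped with a backslash,
    # so the whole pattern can stay inside a double-quoted config string
    match => { "message" => "%{NUMBER:ts} [......] (-|\"%{USERNAME:token1}\") (-|%{DATA:token2}) (-|\"%{WORD:token3}\") (-|\"%{DATA:token4}\")" }
  }
}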
Alternatively, use a mutate gsub filter to strip the quotes after you have extracted the fields:
filter {
mutate {
gsub => [
"fieldname", "\"", ""
]
}
}
https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-gsub
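Applied to the fields from the question, the filter might look like this (a sketch; gsub takes a flat list of field / pattern / replacement triples, one triple per field):

filter {
  mutate {
    # Strip the surrounding double quotes that the QS pattern captured
    gsub => [
      "token1", "\"", "",
      "token3", "\"", "",
      "token4", "\"", ""
    ]
  }
}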