Logstash Grok
April 25, 2018
I'm using Filebeat and Logstash to load log files into Elasticsearch so that they're visible in Kibana. I was struggling to create the correct grok pattern to parse the log entries and make their fields available in Kibana.
The tool at http://grokdebug.herokuapp.com/ was invaluable in debugging and trying out ideas!
My log lines are in the following format:
INFO [2018-04-24 09:49:13,255] uk.co.drumcoder.DrumcoderService: [id=ABC123] Sending message
Here's the grok pattern that matches this log line. Note the timestamp is captured into a field called timestamp, which the date filter below refers to.
%{LOGLEVEL:severity}%{SPACE}\[%{TIMESTAMP_ISO8601:timestamp}\]%{SPACE}%{JAVACLASS:class}:%{SPACE}\[id=(?<messageid>[-_A-Za-z0-9]*)\]%{SPACE}(?<logmsg>.*)
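Run against the sample line above, the pattern should extract fields along these lines (this is the sort of output the grok debugger shows):
severity   => INFO
timestamp  => 2018-04-24 09:49:13,255
class      => uk.co.drumcoder.DrumcoderService
messageid  => ABC123
logmsg     => Sending message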
The full filter configuration within Logstash is below. The two patterns are supplied as an array so grok tries the [id=...] variant first and falls back to the plain one:
filter {
  grok {
    match => {
      "log" => [
        "%{LOGLEVEL:severity}%{SPACE}\[%{TIMESTAMP_ISO8601:timestamp}\]%{SPACE}%{JAVACLASS:class}:%{SPACE}\[id=(?<messageid>[-_A-Za-z0-9]*)\]%{SPACE}(?<logmsg>.*)",
        "%{LOGLEVEL:severity}%{SPACE}\[%{TIMESTAMP_ISO8601:timestamp}\]%{SPACE}%{JAVACLASS:class}:%{SPACE}(?<logmsg>.*)"
      ]
    }
  }
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "ISO8601"]
    target => "@timestamp"
  }
  ruby {
    # split the raw log field into an array of lines (keeping trailing newlines)
    code => "event.set('message', event.get('log').scan(/[^\r\n]+[\r\n]*/));"
  }
  mutate {
    # keep only the first line as the message field
    replace => { "message" => "%{[message][0]}" }
  }
}
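For context, this filter sits between an input and an output in the Logstash pipeline. A minimal sketch, assuming the default beats port and a local Elasticsearch (adjust both to suit your setup):
input {
  beats {
    port => 5044                    # default beats port - match this in filebeat.yml
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]     # assumes a local Elasticsearch instance
  }
}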
Problems
- For some reason, nothing worked until I added json.add_error_key: true to the Filebeat configuration (see the sketch below).
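For reference, a minimal sketch of where that setting lives in filebeat.yml, assuming the Filebeat 6.x prospector layout (the log path is hypothetical):
filebeat.prospectors:
- type: log
  paths:
    - /var/log/drumcoder/app.log    # hypothetical path - use your own log location
  json.add_error_key: true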