
Using Log4J with LogStash

Tags:

log4j

logstash

I'm new to LogStash. I have some logs written from a Java application in Log4J. I'm in the process of trying to get those logs into ElasticSearch. For the life of me, I can't seem to get it to work consistently. Currently, I'm using the following logstash configuration:

input {
  file {
    type => "log4j"
    path => "/home/ubuntu/logs/application.log"
  }
}
filter {
  grok {
    type => "log4j"
    add_tag => [ "ApplicationName" ]
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp}  %{LOGLEVEL:level}" ]
  }
}
output {
  elasticsearch {
    protocol => "http"
    codec => "plain"
    host => "[myIpAddress]"
    port => "[myPort]"
  }
}

This configuration seems to be hit or miss, and I'm not sure why. For instance, of the two messages below, one parses correctly and the other throws a parse failure, even though they look nearly identical. Here are the messages and their respective results:

Tags                   Message
------                 -------
["_grokparsefailure"]  2014-04-04 20:14:11,613 TRACE c.g.w.MyJavaClass [pool-2-thread-6] message was null from https://domain.com/id-1/env-MethodName
["ApplicationName"]    2014-04-04 20:14:11,960 TRACE c.g.w.MyJavaClass [pool-2-thread-4] message was null from https://domain.com/id-1/stable-MethodName

The entry tagged ["ApplicationName"] has my custom timestamp and level fields. However, the entry tagged ["_grokparsefailure"] does NOT have my custom fields. The strange part is that the two log lines are nearly identical, as shown in the message column above. This is really confusing me, and I don't know how to figure out what the problem is or how to get past it. Does anyone know how I can import Log4j logs into Logstash and consistently get the following fields (a fuller pattern I've been experimenting with is shown after this list):

  • Log Level
  • Timestamp
  • Log message
  • Machine Name
  • Thread
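
For reference, the fuller pattern I've been experimenting with looks roughly like this; the extra field names (class, thread, logmessage) are just my own naming, and since the machine name isn't in the log line itself, I assume it would have to come from the host field that Logstash adds on its own:

filter {
  grok {
    type => "log4j"
    add_tag => [ "ApplicationName" ]
    # timestamp, level, logger class, [thread], then the rest of the line
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{JAVACLASS:class} \[%{DATA:thread}\] %{GREEDYDATA:logmessage}" ]
  }
}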

Thank you for any help you can provide. Even if I can just get the log level, timestamp, and log message, that would be a HUGE help. I sincerely appreciate it!

asked Apr 04 '14 by user3469584

People also ask

Is log4j used in Logstash?

In a default Logstash install, the Log4j plugin is installed but not enabled. If you aren't explicitly using this plugin in your configuration, you are not affected by this issue.

How do you mitigate log4j vulnerability for Logstash?

In releases 2.10 and later, this behavior can be mitigated by setting the system property log4j2.formatMsgNoLookups to true. In prior releases (<2.10), it can be mitigated by removing the JndiLookup class from the classpath (for example: zip -q -d log4j-core-*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class).

Can Logstash pull logs?

Logstash supports a variety of inputs that pull in events from a multitude of common sources, all at the same time. Easily ingest from your logs, metrics, web applications, data stores, and various AWS services, all in continuous, streaming fashion.

Does Elasticsearch use log4j?

Elasticsearch uses Log4j 2 for logging. Log4j 2 can be configured using the log4j2.properties file.
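
As a minimal, illustrative log4j2.properties fragment (the logger name here is only an example of raising one logger's level, not a required setting):

# illustrative log4j2.properties fragment
logger.transport.name = org.elasticsearch.transport
logger.transport.level = debug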


2 Answers

I'd recommend using Logstash's log4j input (a socket listener) together with Log4j's SocketAppender.

Logstash conf:

input {
  log4j {
    mode => server
    host => "0.0.0.0"
    port => [logstash_port]
    type => "log4j"
  }
}
output {
  elasticsearch {
    protocol => "http"
    host => "[myIpAddress]"
    port => "[myPort]"
  }
}

log4j.properties:

log4j.rootLogger=[level], [myAppender]
log4j.appender.[myAppender]=org.apache.log4j.net.SocketAppender
log4j.appender.[myAppender].port=[log4j_port]
log4j.appender.[myAppender].remoteHost=[logstash_host]

There's more info in the logstash docs for their log4j input: http://logstash.net/docs/1.4.2/inputs/log4j

answered Sep 28 '22 by ranxxerox

It looks like the SocketAppender approach used in the earlier answer has since been deprecated because of a security issue. The currently recommended solution is to use a Log4j FileAppender, ship the log file to Logstash with Filebeat, and then filter it there. A rough sketch of that setup is shown after the links. For more information, you can refer to the links below:

https://www.elastic.co/blog/log4j-input-logstash

https://www.elastic.co/guide/en/logstash/current/plugins-inputs-log4j.html
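
For reference, a minimal sketch of that file-based pipeline might look like the following; the appender name, file path, pattern layout, and Beats port (5044) are placeholders and assumptions rather than anything prescribed by the links above.

log4j.properties (write application logs to a file):

# illustrative Log4j 1.x FileAppender setup
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/home/ubuntu/logs/application.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} %-5p %c [%t] %m%n

filebeat.yml (ship the file to Logstash):

# illustrative Filebeat input and Logstash output
filebeat.inputs:
- type: log
  paths:
    - /home/ubuntu/logs/application.log
output.logstash:
  hosts: ["[logstash_host]:5044"]

logstash.conf (receive events from Filebeat; note that newer Logstash versions use hosts instead of the old protocol/host/port options):

input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["[myIpAddress]:[myPort]"]
  }
}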

answered Sep 28 '22 by Arijit B