Log4j logging directly to elasticsearch server

I'm a bit confused about how to send my log entries directly to Elasticsearch (not Logstash). So far I've found a few appenders (log4j.appender.SocketAppender, log4j.appender.server, etc.) that allow sending logs to a remote host, plus a ConversionPattern option that seems to let you convert logs to an "elastic-friendly" format, but this approach looks freaky... or am I mistaken? Is this the only way to send logs to Elasticsearch?

So far I have this config:

log4j.rootLogger=DEBUG, server
log4j.appender.server=org.apache.log4j.net.SocketAppender
log4j.appender.server.Port=9200
log4j.appender.server.RemoteHost=localhost
log4j.appender.server.ReconnectionDelay=10000
log4j.appender.server.layout.ConversionPattern={"debug_level":"%p","debug_timestamp":"%d{ISO8601}","debug_thread":"%t","debug_file":"%F", "debug_line":"%L","debug_message":"%m"}%n

But I get an error:

log4j:WARN Detected problem with connection: java.net.SocketException: Broken pipe (Write failed)

I can't find any useful example, so I can't understand what I'm doing wrong or how to fix it. Thanks.
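For context on why this setup fails: Elasticsearch listens on port 9200 for HTTP requests, while log4j 1.x's SocketAppender writes serialized Java LoggingEvent objects over a raw socket and ignores the layout entirely, so Elasticsearch drops the connection. A minimal sketch of indexing one log entry over plain HTTP instead (the index name "app-logs" is an assumption; the field names mirror the ConversionPattern above):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Instant;

public class DirectEsLogger {

    // Build a JSON document mirroring the fields of the ConversionPattern above.
    static String toJson(String level, String thread, String file, int line, String message) {
        return String.format(
            "{\"debug_level\":\"%s\",\"debug_timestamp\":\"%s\",\"debug_thread\":\"%s\","
          + "\"debug_file\":\"%s\",\"debug_line\":\"%d\",\"debug_message\":\"%s\"}",
            level, Instant.now(), thread, file, line, message);
    }

    // POST the document to a (hypothetical) "app-logs" index over HTTP,
    // which is the protocol Elasticsearch actually speaks on 9200.
    static int send(String json) throws Exception {
        HttpRequest req = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:9200/app-logs/_doc"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(json))
            .build();
        HttpResponse<String> resp = HttpClient.newHttpClient()
            .send(req, HttpResponse.BodyHandlers.ofString());
        return resp.statusCode();
    }

    public static void main(String[] args) {
        System.out.println(toJson("DEBUG", "main", "App.java", 42, "hello"));
    }
}
```

This is only a sketch of the wire format, not production logging code; in practice a dedicated appender (see the answers below) handles batching and failover.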

asked May 26 '17 by Frankie Drake

People also ask

Can you send logs directly to Elasticsearch?

Logs can be sent to Elasticsearch in different ways, including directly from the application and by using a data shipper such as Filebeat.

Is log4j used in Elasticsearch?

Elasticsearch uses Log4j 2 for logging. Log4j 2 can be configured using the log4j2.properties file.

Is Kibana using log4j?

The way logging works in Kibana is inspired by the log4j 2 logging framework used by Elasticsearch.


3 Answers

I've written this appender, Log4J2 Elastic REST Appender, if you want to use it. It can buffer log events based on time and/or number of events before sending them to Elasticsearch (using the _bulk API so that everything goes in one request). It has been published to Maven Central, so adding it is pretty straightforward.
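The batching this answer describes comes down to the _bulk API's newline-delimited body: one action line per document, followed by the document source, with a trailing newline. A minimal sketch of assembling such a body (the index name and documents are illustrative, not the appender's actual internals):

```java
import java.util.List;

public class BulkBody {

    // Build a _bulk request body: for each document, emit an "index" action
    // line naming the target index, then the document itself, each
    // newline-terminated as the bulk API requires.
    static String build(String index, List<String> jsonDocs) {
        StringBuilder sb = new StringBuilder();
        for (String doc : jsonDocs) {
            sb.append("{\"index\":{\"_index\":\"").append(index).append("\"}}\n");
            sb.append(doc).append("\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(build("app-logs",
            List.of("{\"msg\":\"started\"}", "{\"msg\":\"ready\"}")));
    }
}
```

A body like this is then POSTed to http://localhost:9200/_bulk with Content-Type application/x-ndjson, so N buffered events cost one HTTP round trip instead of N.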

answered Oct 12 '22 by Marcelo Grossi


If you'd like to check out something new, my Log4j2 Elasticsearch Appenders will give you async logging in batches with failover.

answered Oct 12 '22 by rfoltyns


I found the solution that best fits my requirements: Graylog. Since it's built on top of Elasticsearch, the usage is familiar, so I was able to switch to it immediately.

To use it, I added this dependency along with the basic log4j2 dependencies:

<dependency>
    <groupId>org.graylog2.log4j2</groupId>
    <artifactId>log4j2-gelf</artifactId>
    <version>1.3.2</version>
</dependency>

and used this log4j2.json configuration:

{
  "configuration": {
    "status": "info",
    "name": "LOGGER",
    "packages": "org.graylog2.log4j2",
    "appenders": {
      "GELF": {
        "name": "GELF",
        "server": "log.myapp.com",
        "port": "12201",
        "hostName": "my-awsome-app",
        "JSONLayout": {
          "compact": "false",
          "locationInfo": "true",
          "complete": "true",
          "eventEol": "true",
          "properties": "true",
          "propertiesAsList": "true"
        },
        "ThresholdFilter": {
          "level": "info"
        }
      }
    },
    "loggers": {
      "logger": [
        {
          "name": "io.netty",
          "level": "info",
          "additivity": "false",
          "AppenderRef": {
            "ref": "GELF"
          }
        }        
      ],
      "root": {
        "level": "info",
        "AppenderRef": [
          {
            "ref": "GELF"
          }
        ]
      }
    }
  }
}
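Under the hood, a GELF appender ships each log event to the server configured above as a small JSON payload. A rough sketch of a GELF 1.1 message, assembled by hand for illustration (field values are assumptions, not the library's exact output):

```java
import java.util.Locale;

public class GelfMessage {

    // Build a minimal GELF 1.1 payload: version, host, short_message,
    // a Unix timestamp in seconds, and a syslog severity level.
    static String build(String host, String shortMessage, int syslogLevel, double timestamp) {
        return String.format(Locale.ROOT,
            "{\"version\":\"1.1\",\"host\":\"%s\",\"short_message\":\"%s\","
          + "\"timestamp\":%.3f,\"level\":%d}",
            host, shortMessage, timestamp, syslogLevel);
    }

    public static void main(String[] args) {
        // 6 is the syslog level for "informational".
        System.out.println(build("my-awsome-app", "service started", 6, 1495792800.0));
    }
}
```

The actual log4j2-gelf appender builds and sends these messages for you; this sketch only shows what crosses the wire to port 12201.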
answered Oct 12 '22 by Frankie Drake