
Logs to be sent from java app to logstash

I am trying to push logs from my Java application to Logstash on port 4512, but when I run Logstash, the logs arrive as junk characters (as shown in the snippet below). Can someone suggest how to handle this on the application end so that the Java application and Logstash can interact as needed?

package com.logging.messages.Messager;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class SockerLogger {
    public static void main(String[] args)  {
        Logger log = LoggerFactory.getLogger(SockerLogger.class);
        log.info("Info");
        log.debug("DEBUG");
    }
}

log4j.properties of java app:

log4j.rootLogger=INFO, server
log4j.appender.server=org.apache.log4j.net.SocketAppender
log4j.appender.server.Port=4512
log4j.appender.server.RemoteHost=localhost
log4j.appender.server.ReconnectionDelay=10000
log4j.appender.server.layout.ConversionPattern={"debug_level":"%p","debug_timestamp":"%d{ISO8601}","debug_thread":"%t","debug_file":"%F", "debug_line":"%L","debug_message":"%m"}%n

logstash properties/conf file:

# Specifying input host and port number for retrieving application log messages

input {
   tcp {
    port => "4512"
#    type => "log"
     codec => "json"
 }
}

#filter {
#    grok {
#       match => [ "message" => "_@timestamp","yyyy-MM-dd HH:mm:ss,SSS" ]
# }
#}


# Pushing Log Messages from Logstash to Elastic Search

output {
#  elasticsearch {
#       hosts => ["localhost:9200"]
#       index => "logshub"
# }
 stdout { codec => plain }
}
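To rule out the Logstash side, the `tcp` input with the `json` codec can be exercised independently of log4j. The sketch below (my addition, not from the question) builds one JSON line shaped like the `ConversionPattern` above and prints it, so it can be piped into the socket; the field names and the `nc` usage are assumptions based on the config shown.

```java
// Hypothetical smoke test: emit one JSON event line shaped like the
// ConversionPattern in log4j.properties, so the logstash "json" codec
// can be verified without involving log4j at all.
public class JsonLineSmokeTest {

    // Builds a single JSON event line; field names mirror the
    // debug_* keys used in the ConversionPattern above.
    static String buildEventLine(String level, String message) {
        return String.format(
            "{\"debug_level\":\"%s\",\"debug_timestamp\":\"2017-09-20T18:29:59,436\","
            + "\"debug_thread\":\"main\",\"debug_file\":\"SockerLogger.java\","
            + "\"debug_line\":\"16\",\"debug_message\":\"%s\"}",
            level, message);
    }

    public static void main(String[] args) {
        // Pipe this into the TCP input, e.g.:
        //   java JsonLineSmokeTest | nc localhost 4512
        System.out.println(buildEventLine("INFO", "Info"));
    }
}
```

If a line sent this way shows up cleanly in the `stdout` output, the Logstash pipeline is fine and the problem is confined to what the appender puts on the wire.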

pom.xml dependencies:

<dependencies>
    <!-- https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-core -->
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.9.0</version>
    </dependency>

    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>1.7.25</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.25</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.25</version>
    </dependency>
</dependencies>
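A side note on the dependencies (my observation, not from the question): `slf4j-log4j12` binds SLF4J to Log4j 1.x, so the `log4j-core` 2.9.0 artifact is not actually used by this logging path, and the Log4j 1.x runtime that `slf4j-log4j12` delegates to does not appear to be declared. A sketch of the dependency the 1.x setup presumably needs:

```xml
<!-- Assumed: the Log4j 1.x runtime that slf4j-log4j12 delegates to -->
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>
```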

Logstash console output

[2017-09-20T18:29:59,436][WARN ][logstash.codecs.jsonlines] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'ZmdcCopyLookupRequiredZndcLookupRequiredJ': was expecting ('true', 'false' or 'null')
 at [Source: ZmdcCopyLookupRequiredZndcLookupRequiredJ  timeStampL
categoryNametLjava/lang/String;L
locationInfot#Lorg/apache/log4j/spi/LocationInfo;LmdcCopytLjava/util/Hashtable;Lndcq~LrenderedMessageq~L; line: 1, column: 47]>, :data=>"Z\u0000\u0015mdcCopyLookupRequiredZ\u0000\u0011ndcLookupRequiredJ\u0000\ttimeStampL\u0000\fcategoryNamet\u0000\u0012Ljava/lang/String;L\u0000\flocationInfot\u0000#Lorg/apache/log4j/spi/LocationInfo;L\u0000\amdcCopyt\u0000\u0015Ljava/util/Hashtable;L\u0000\u0003ndcq\u0000~\u0000\u0001L\u0000\u000FrenderedMessageq\u0000~\u0000\u0001L\u0000"}
2017-09-20T22:29:59.434Z 127.0.0.1 \xAC\xED\u0000\u0005sr\u0000!org.apache.log4j.spi.LoggingEvent\xF3\xF2\xB9#t\v\xB5?\u0003\u00002017-09-20T22:29:59.455Z 127.0.0.1 ZmdcCopyLookupRequiredZndcLookupRequiredJ  timeStampL
categoryNametLjava/lang/String;L
locationInfot#Lorg/apache/log4j/spi/LocationInfo;LmdcCopytLjava/util/Hashtable;Lndcq~LrenderedMessageq~L^C[2017-09-20T18:30:17,549][WARN ][logstash.runner          ] SIGINT received. Shutting down the agent.
[2017-09-20T18:30:17,559][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}
Shankar Guru asked Feb 27 '26 17:02


1 Answer

As the log4j API doc states:

SocketAppenders do not use a layout. They ship a serialized LoggingEvent object to the server side.

Logstash at the other end of the socket expects messages in the format of the configured codec, so the `json` codec tries to parse the serialized Java object and fails, which is what produces the junk characters. As far as I know, there is no ready-to-use appender for this purpose in Java, nor a fitting codec on the Logstash side.
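One workaround sometimes used (my addition, not part of the answer above) is Logstash's `log4j` input plugin, which deserializes Log4j 1.x `LoggingEvent` objects directly, so no layout or codec trickery is needed on the application side. A minimal sketch, assuming the plugin is available (it may need `bin/logstash-plugin install logstash-input-log4j`) and replaces the `tcp` input on the same port:

```
input {
  # Reads serialized Log4j 1.x LoggingEvent objects sent by SocketAppender,
  # so the application config can stay as it is.
  log4j {
    mode => "server"
    port => 4512
  }
}
```

Alternatively, since the pom already declares `log4j-core` 2.x: Log4j 2's `SocketAppender` does accept a layout (e.g. `JsonLayout`), which would let the existing `tcp` input with the `json` codec work unchanged after migrating the app to Log4j 2 configuration.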

blafasel answered Mar 02 '26 07:03