I'm using SLF4J with Logback in a JAX-RS application... I want to log to JSON in such a way that my message is not encoded again but printed raw into the logfile:
At the moment it looks like this:
{"@timestamp":1363834123012,"@message":"{\"text\":\"From MLK to Barack Ob...\n\"}"}
But I want to have this:
{"@timestamp":1363834123012,"@message":{"text":"From MLK to Barack Ob...\n"}}
The reason is that I want to parse the JSON again later and want to avoid having to unescape the data. I've written a custom Logback encoder, but I found no way to avoid the escaping. Can I pass an object to Logback and change the settings based on the type of the object?
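For context, the double escaping happens because the encoder treats the already-serialized JSON message as an ordinary string value, so every quote in it gets escaped a second time. A minimal, dependency-free sketch of the effect (the escapeJson helper is hypothetical, just modelling what any JSON writer does to a string value):

```java
public class DoubleEncodingDemo {

    // Hypothetical helper modelling what a JSON serializer does to a
    // string value (only quotes, backslashes and newlines handled here)
    static String escapeJson(String s) {
        StringBuilder sb = new StringBuilder();
        for (char c : s.toCharArray()) {
            switch (c) {
                case '"':  sb.append("\\\""); break;
                case '\\': sb.append("\\\\"); break;
                case '\n': sb.append("\\n");  break;
                default:   sb.append(c);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String msg = "{\"text\":\"From MLK to Barack Ob...\"}"; // already JSON
        // An encoder that sees msg as a plain string escapes it again:
        String line = "{\"@message\":\"" + escapeJson(msg) + "\"}";
        System.out.println(line);
        // -> {"@message":"{\"text\":\"From MLK to Barack Ob...\"}"}
    }
}
```

To get the nested object instead, the encoder has to emit the message as raw JSON rather than as a string value, which is what the approaches below do.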
Edit: I've found a way, though not exactly an elegant one. As requested, an SSCCE:
In my application:

// SLF4J Logger
private static Logger logger = LoggerFactory.getLogger(MyClass.class);
// A Logback Marker
private Marker foo = MarkerFactory.getMarker("foo");
// Jackson ObjectMapper
ObjectMapper mapper = new ObjectMapper();

// Log something...
logger.info(foo, mapper.writeValueAsString(json));
I've used a variation of the Logstash-Encoder found here: https://github.com/logstash/logstash-logback-encoder
package my.package;
import static org.apache.commons.io.IOUtils.*;
import java.io.IOException;
import java.util.Map;
import java.util.Map.Entry;
import org.codehaus.jackson.JsonGenerator.Feature;
import org.codehaus.jackson.JsonNode;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.node.ObjectNode;
import org.slf4j.Marker;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.classic.spi.IThrowableProxy;
import ch.qos.logback.classic.spi.ThrowableProxyUtil;
import ch.qos.logback.core.CoreConstants;
import ch.qos.logback.core.encoder.EncoderBase;
public class JsonEncoder extends EncoderBase<ILoggingEvent> {

    private static final ObjectMapper MAPPER = new ObjectMapper()
            .configure(Feature.ESCAPE_NON_ASCII, true);

    private boolean immediateFlush = true;

    @Override
    public void doEncode(ILoggingEvent event) throws IOException {
        Marker marker = event.getMarker();
        ObjectNode eventNode = MAPPER.createObjectNode();
        eventNode.put("@timestamp", event.getTimeStamp());
        if (marker != null) {
            if (marker.getName().equals("foo")) {
                // The formatted message is already JSON: parse it and
                // attach it as a tree so it is not escaped again
                JsonNode j = MAPPER.readTree(event.getFormattedMessage());
                eventNode.put("@foo", j);
            }
        } else {
            eventNode.put("@message", event.getFormattedMessage());
        }
        eventNode.put("@fields", createFields(event));
        write(MAPPER.writeValueAsBytes(eventNode), outputStream);
        write(CoreConstants.LINE_SEPARATOR, outputStream);
        if (immediateFlush) {
            outputStream.flush();
        }
    }

    private ObjectNode createFields(ILoggingEvent event) {
        // not important here
        return fieldsNode;
    }

    @Override
    public void close() throws IOException {
        write(LINE_SEPARATOR, outputStream);
    }

    public boolean isImmediateFlush() {
        return immediateFlush;
    }

    public void setImmediateFlush(boolean immediateFlush) {
        this.immediateFlush = immediateFlush;
    }
}
It works now! But I guess it's not the best way to do it (serializing the JSON only to deserialize it again in the encoder...).
If you have JSON-formatted messages, the solutions above work, but they are not very nice, since you don't want to call logstash-specific code every time you use your logger. Just adding a

net.logstash.logback.encoder.LogstashEncoder

is not enough, since the message itself stays escaped. To solve this, try the following in your logback.xml:
<encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
    <providers>
        <timestamp/>
        <version/>
        <loggerName/>
        <pattern>
            <pattern>
                {
                    "jsonMessage": "#asJson{%message}"
                }
            </pattern>
        </pattern>
    </providers>
</encoder>
The #asJson pattern will unescape your message.
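With this configuration, a call like logger.info(jsonString) should then produce a log line roughly of the following shape (field names are the logstash-logback-encoder provider defaults as I recall them, so treat this as approximate, not verified output):

```json
{"@timestamp":"2013-03-21T04:08:43.012Z","@version":"1","logger_name":"my.package.MyClass","jsonMessage":{"text":"From MLK to Barack Ob...\n"}}
```

The point is that jsonMessage is a nested object, not an escaped string.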
Use the RawJsonAppendingMarker:
log.trace(net.logstash.logback.marker.Markers.appendRaw("jsonMessage", jsonString), null);
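Conceptually, appendRaw splices the given string into the JSON output verbatim instead of quoting and escaping it as a string value. A dependency-free model of the difference (the helper names here are hypothetical, not the library's API):

```java
public class RawVsEscaped {

    // What a plain %message in a JSON encoder does: quote and escape
    static String stringField(String name, String value) {
        return "\"" + name + "\":\""
                + value.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }

    // What appendRaw does conceptually: splice the value in verbatim
    static String rawField(String name, String rawJson) {
        return "\"" + name + "\":" + rawJson;
    }

    public static void main(String[] args) {
        String json = "{\"text\":\"hi\"}";
        System.out.println("{" + stringField("jsonMessage", json) + "}");
        // -> {"jsonMessage":"{\"text\":\"hi\"}"}   nested string
        System.out.println("{" + rawField("jsonMessage", json) + "}");
        // -> {"jsonMessage":{"text":"hi"}}         nested object
    }
}
```

Note that the raw variant only yields valid output if the string really is well-formed JSON; the marker does no validation for you.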