I have this Avro schema:
{
    "namespace": "xx.xxxx.xxxxx.xxxxx",
    "type": "record",
    "name": "MyPayLoad",
    "fields": [
        {"name": "filed1", "type": "string"},
        {"name": "filed2", "type": "long"},
        {"name": "filed3", "type": "boolean"},
        {
            "name": "metrics",
            "type": {
                "type": "array",
                "items": {
                    "name": "MyRecord",
                    "type": "record",
                    "fields": [
                        {"name": "min", "type": "long"},
                        {"name": "max", "type": "long"},
                        {"name": "sum", "type": "long"},
                        {"name": "count", "type": "long"}
                    ]
                }
            }
        }
    ]
}
Here is the code we use to parse the data:
public static final MyPayLoad parseBinaryPayload(byte[] payload) {
    DatumReader<MyPayLoad> payloadReader = new SpecificDatumReader<>(MyPayLoad.class);
    Decoder decoder = DecoderFactory.get().binaryDecoder(payload, null);
    MyPayLoad myPayLoad = null;
    try {
        myPayLoad = payloadReader.read(null, decoder);
    } catch (IOException e) {
        logger.log(Level.SEVERE, e.getMessage(), e);
    }
    return myPayLoad;
}
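For context, the writer side is not shown above, but the payload bytes are presumably produced with a plain binary encoder, along the lines of this sketch (the method name `toBinaryPayload` is an assumption). The important detail is that the raw binary encoding writes no schema or marker bytes alongside the data, so the reader must already know the exact schema the writer used:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumWriter;

public static byte[] toBinaryPayload(MyPayLoad payload) throws IOException {
    DatumWriter<MyPayLoad> writer = new SpecificDatumWriter<>(MyPayLoad.class);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    // Raw binary encoding: only the field values are written, in schema order.
    // Nothing in the byte stream identifies which schema version produced it.
    BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
    writer.write(payload, encoder);
    encoder.flush();
    return out.toByteArray();
}
```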
Now I want to add one more field to the schema, so that it looks like the one below:
{
    "namespace": "xx.xxxx.xxxxx.xxxxx",
    "type": "record",
    "name": "MyPayLoad",
    "fields": [
        {"name": "filed1", "type": "string"},
        {"name": "filed2", "type": "long"},
        {"name": "filed3", "type": "boolean"},
        {
            "name": "metrics",
            "type": {
                "type": "array",
                "items": {
                    "name": "MyRecord",
                    "type": "record",
                    "fields": [
                        {"name": "min", "type": "long"},
                        {"name": "max", "type": "long"},
                        {"name": "sum", "type": "long"},
                        {"name": "count", "type": "long"}
                    ]
                }
            }
        },
        {"name": "agentType", "type": ["null", "string"], "default": "APP_AGENT"}
    ]
}
Note the field that was added, and that a default is defined for it. The problem is that if we receive data that was written using the older schema, I get this error:
java.io.EOFException: null
at org.apache.avro.io.BinaryDecoder.ensureBounds(BinaryDecoder.java:473) ~[avro-1.7.4.jar:1.7.4]
at org.apache.avro.io.BinaryDecoder.readInt(BinaryDecoder.java:128) ~[avro-1.7.4.jar:1.7.4]
at org.apache.avro.io.BinaryDecoder.readIndex(BinaryDecoder.java:423) ~[avro-1.7.4.jar:1.7.4]
at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:229) ~[avro-1.7.4.jar:1.7.4]
at org.apache.avro.io.parsing.Parser.advance(Parser.java:88) ~[avro-1.7.4.jar:1.7.4]
at org.apache.avro.io.ResolvingDecoder.readIndex(ResolvingDecoder.java:206) ~[avro-1.7.4.jar:1.7.4]
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:152) ~[avro-1.7.4.jar:1.7.4]
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:177) ~[avro-1.7.4.jar:1.7.4]
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:148) ~[avro-1.7.4.jar:1.7.4]
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:139) ~[avro-1.7.4.jar:1.7.4]
at com.appdynamics.blitz.shared.util.XXXXXXXXXXXXX.parseBinaryPayload(BlitzAvroSharedUtil.java:38) ~[blitz-shared.jar:na]
What I understood from the documentation is that this change should have been backward compatible, but that doesn't seem to be the case. Any idea what I am doing wrong?
Finally I got this working. I needed to give both schemas to the SpecificDatumReader, so I modified the parsing like this, passing both the old (writer) and the new (reader) schema to the reader, and it worked like a charm:
public static final MyPayLoad parseBinaryPayload(byte[] payload) {
    DatumReader<MyPayLoad> payloadReader = new SpecificDatumReader<>(SCHEMA_V1, SCHEMA_V2);
    Decoder decoder = DecoderFactory.get().binaryDecoder(payload, null);
    MyPayLoad myPayLoad = null;
    try {
        myPayLoad = payloadReader.read(null, decoder);
    } catch (IOException e) {
        logger.log(Level.SEVERE, e.getMessage(), e);
    }
    return myPayLoad;
}
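The code above assumes `SCHEMA_V1` and `SCHEMA_V2` constants holding the old (writer) and new (reader) schemas. One way to obtain them, sketched below, is to parse the two `.avsc` documents once from classpath resources; the resource names here are assumptions, and the files would contain the two schema documents shown above:

```java
import java.io.IOException;
import org.apache.avro.Schema;

final class PayloadSchemas {
    // Writer schema (old) and reader schema (new), parsed once and cached.
    static final Schema SCHEMA_V1 = load("/MyPayLoad_v1.avsc");
    static final Schema SCHEMA_V2 = load("/MyPayLoad_v2.avsc");

    private static Schema load(String resource) {
        try {
            return new Schema.Parser().parse(
                    PayloadSchemas.class.getResourceAsStream(resource));
        } catch (IOException e) {
            throw new ExceptionInInitializerError(e);
        }
    }
}
```

For the reader schema, the generated `MyPayLoad` class also exposes its own schema via `MyPayLoad.getClassSchema()`. Separately, note that per the Avro specification the default value of a union-typed field must correspond to the *first* branch of the union, so a `"APP_AGENT"` default with `["null", "string"]` would normally be written as `["string", "null"]` instead (or the default changed to `null`).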