I get the following error when I try to send an Avro message that contains a field of type long:
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 61
Caused by: java.lang.ClassCastException: java.lang.Long cannot be cast to org.joda.time.DateTime
I use Confluent 3.2.0 and Apache Spark 2.2.0. The error is thrown in a Spark job that processes Avro messages and prints them to the console. In the Avro schema, the corresponding field is defined like this:
{\"name\": \"event_time\", \"type\": { \"type\" : \"long\", \"logicalType\": \"timestamp-millis\"}}
In the Java class generated from the .avsc file, the field is defined as below:
private DateTime event_time;
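For context, a minimal complete schema containing this field might look like the sketch below; the record name Event and the namespace are placeholders I'm assuming, not part of the original question.

{
  "type": "record",
  "name": "Event",
  "namespace": "com.example",
  "fields": [
    {"name": "event_time", "type": {"type": "long", "logicalType": "timestamp-millis"}}
  ]
}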
I encountered a similar issue using Confluent 4.0.0 and Avro 1.8.2. I had a stream processor that was attempting to convert a long to a DateTime. I overcame the issue by registering the correct logical type conversion on the SpecificData static utility class before any processing logic runs:
SpecificData.get().addLogicalTypeConversion(new TimeConversions.TimestampConversion());
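Spelled out with its imports, the registration might look roughly like the sketch below for Avro 1.8.x. The idea is to run it once, before any record is deserialized, e.g. at the start of the Spark job's main method; the class and method names here are illustrative, not from the original answer.

import org.apache.avro.data.TimeConversions;
import org.apache.avro.specific.SpecificData;

public class EventProcessingJob {
    public static void main(String[] args) {
        // Teach SpecificData to map the timestamp-millis logical type
        // (a long on the wire) to org.joda.time.DateTime, matching the
        // field type in the generated class.
        SpecificData.get().addLogicalTypeConversion(
                new TimeConversions.TimestampConversion());

        // ... build the Spark job and process the Avro records as before ...
    }
}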
In Avro 1.9.x and above, the symbol TimestampConversion is no longer present. Replacing it in the snippet from @user3222582 with TimestampMillisConversion fixed the compilation error for me with Avro 1.9.2.
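For reference, the equivalent registration under Avro 1.9.x is sketched below. Note that org.apache.avro.data.TimeConversions.TimestampMillisConversion converts the underlying long to java.time.Instant rather than Joda's DateTime, so it matches classes generated with the java.time option.

import org.apache.avro.data.TimeConversions;
import org.apache.avro.specific.SpecificData;

// Avro 1.9.x replacement for the Avro 1.8 registration above:
SpecificData.get().addLogicalTypeConversion(
        new TimeConversions.TimestampMillisConversion());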