I couldn't find any "Best Practices" online for using gRPC and protobuf within a project. I'm implementing an event-sourced server-side app. The core defines the domain aggregates, events, and services without any external dependencies. The gRPC server calls the core services, passing in request objects, which eventually translate into events being published. Events are serialized using protobuf and published on the wire. We're currently in a dilemma: should our events be the protobuf-generated classes directly, or should we keep the core and events separate and implement a mapper/serializer layer that translates events between protobuf and the core?
If there's another approach we're not considering, please guide us :)
Thanks for the help.
Domain model objects and data transfer objects (protobuf messages) should be kept as separate as possible. The best way to do this is to transform your domain model objects into Google Protobuf messages and vice versa. We've made a protobuf-converter to make this extremely simple.
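A hand-rolled mapper (independent of any converter library) might look like the sketch below. `OrderCreated` and `OrderCreatedProto` are hypothetical names; in a real project the latter would be the class generated by protoc, not a hand-written stand-in:

```java
import java.time.Instant;

// Hypothetical domain event: plain Java, no protobuf dependency.
final class OrderCreated {
    final String orderId;
    final long amountCents;
    final Instant occurredAt;

    OrderCreated(String orderId, long amountCents, Instant occurredAt) {
        this.orderId = orderId;
        this.amountCents = amountCents;
        this.occurredAt = occurredAt;
    }
}

// Stand-in for the protobuf-generated message (normally produced by protoc).
final class OrderCreatedProto {
    String orderId;
    long amountCents;
    long occurredAtEpochMillis;
}

// The mapper layer: the only code that knows about both representations.
final class OrderCreatedMapper {
    static OrderCreatedProto toProto(OrderCreated event) {
        OrderCreatedProto proto = new OrderCreatedProto();
        proto.orderId = event.orderId;
        proto.amountCents = event.amountCents;
        proto.occurredAtEpochMillis = event.occurredAt.toEpochMilli();
        return proto;
    }

    static OrderCreated fromProto(OrderCreatedProto proto) {
        return new OrderCreated(
            proto.orderId,
            proto.amountCents,
            Instant.ofEpochMilli(proto.occurredAtEpochMillis));
    }
}
```

The cost is boilerplate per event type; the benefit is that the core never imports generated code, so regenerating or restructuring the .proto files only touches the mapper.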
Protobufs are really good for serialization and backwards compatibility, but not so good at being first-class Java objects. Adding custom functionality to protos is currently not possible. You can get most of the benefits by using protobufs at the stub layer, wrapping them in one of your event POJOs, and passing those around internally:
public final class Event {
    private final EventProto proto;

    public Event(EventProto proto) {
        this.proto = proto;
    }

    public void foo() {
        // do something with proto
    }
}
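A minimal sketch of the pattern in use, assuming hypothetical names throughout: `EventProto` stands in for the protoc-generated class, and `isOfType` is an invented domain method. Callers only ever see the wrapper; the proto escapes it only at the serialization boundary:

```java
// Stand-in for the protobuf-generated message class.
final class EventProto {
    final String type;
    final String payload;

    EventProto(String type, String payload) {
        this.type = type;
        this.payload = payload;
    }
}

// Domain-facing wrapper: holds the proto and adds the behavior
// that generated classes cannot carry.
final class Event {
    private final EventProto proto;

    Event(EventProto proto) {
        this.proto = proto;
    }

    // Domain logic lives here, not on the generated class.
    boolean isOfType(String type) {
        return proto.type.equals(type);
    }

    // Hand the proto back only where serialization happens.
    EventProto toProto() {
        return proto;
    }
}
```

This keeps serialization free (the proto is stored as-is) while still giving you a place to hang domain methods.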
Most projects don't change their .proto files that often, and almost never in a backwards-incompatible way (neither on the wire nor in the API). In my experience, having to change a lot of code because of proto changes has never been a problem.