 

Canonical way to transmit protocol buffers over network?

I'm trying to use Google's protocol buffers (protobuf) with Python in a networked program, using bare sockets. My question is: after the transmitting side sends a message, how does the receiving side know what kind of message was transmitted? For example, say I have these message definitions:

message StrMessage {
    required string str = 1;
}

message IntMessage {
    required int32 num = 1;
}

Now the transmitter makes a StrMessage, serializes it, and sends the serialized bytes over the network. How does the receiver know to deserialize the bytes with StrMessage rather than IntMessage? I've tried doing two things:

// Proposal 1: send one byte header to indicate type
enum MessageType {
    STR_MESSAGE = 1;
    INT_MESSAGE = 2;
}

// Proposal 2: use a wrapper message
message Packet {
    optional StrMessage m_str = 1;
    optional IntMessage m_int = 2;
}
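
For Proposal 1, the receiver also needs some framing on top of the type byte, since the socket only delivers an unstructured byte stream. Roughly, I imagine something like the sketch below (assuming the generated module is called messages_pb2; the type byte and 4-byte length prefix are framing I'd be inventing myself, not anything protobuf provides):

import struct

from messages_pb2 import StrMessage, IntMessage  # assumed name of the generated module

# Type tags for the 1-byte header (my own convention, mirroring MessageType above)
STR_MESSAGE = 1
INT_MESSAGE = 2

def send_message(sock, msg):
    if isinstance(msg, StrMessage):
        msg_type = STR_MESSAGE
    elif isinstance(msg, IntMessage):
        msg_type = INT_MESSAGE
    else:
        raise ValueError("unsupported message type")
    body = msg.SerializeToString()
    # 1-byte type + 4-byte big-endian length, then the serialized payload
    sock.sendall(struct.pack("!BI", msg_type, len(body)) + body)

def recv_exactly(sock, n):
    # Keep reading until exactly n bytes have arrived
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        data += chunk
    return data

def recv_message(sock):
    msg_type, length = struct.unpack("!BI", recv_exactly(sock, 5))
    body = recv_exactly(sock, length)
    # Dispatch on the type byte to pick the right message class
    msg = StrMessage() if msg_type == STR_MESSAGE else IntMessage()
    msg.ParseFromString(body)
    return msg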

Neither of these seems very clean, though, and both require me to list all the message types by hand. Is there a canonical/better way to handle this problem?

Thanks!

asked Aug 15 '12 by fyhuang


1 Answer

This has been discussed before, for example in this thread on the protobuf list, but put simply: there is no canonical / de-facto way of doing this.

Personally, I like the Packet approach, as it keeps everything self-contained; indeed, in protobuf-net I have specific methods to process data in that format, returning just the StrMessage / IntMessage and leaving the Packet layer as an unimportant implementation detail that never actually gets used. But since that earlier proposal by Kenton never got implemented (AFAIK), it is entirely a matter of personal taste.
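
In the OP's Python setup, a minimal sketch of the Packet approach might look like the following (messages_pb2 is an assumed name for the generated module, and some framing, such as a length prefix, is still needed on the wire so the receiver knows where each Packet ends):

from messages_pb2 import Packet  # assumed name of the generated module

# Sender side: wrap whichever message is being sent in a Packet
pkt = Packet()
pkt.m_str.str = "hello"            # assigning into the sub-message marks m_str as present
wire_bytes = pkt.SerializeToString()
# (add framing, e.g. a length prefix, before writing wire_bytes to the socket)

# Receiver side: always parse a Packet, then dispatch on whichever field is set
incoming = Packet()
incoming.ParseFromString(wire_bytes)
if incoming.HasField("m_str"):
    print("StrMessage:", incoming.m_str.str)
elif incoming.HasField("m_int"):
    print("IntMessage:", incoming.m_int.num)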

answered Oct 10 '22 by Marc Gravell