We are considering using Protocol Buffers for communication between a Python service and a Node.js service that each live in their own repo. Since the .proto files must be accessible to both repos, how should we share them?

We are currently considering:

- A repo of just the .proto files, made a git subtree of all our services
- A repo of just the .proto files, publishing both a private Python module and a private Node module on push, and requiring those modules from the respective services
- A repo of just the .proto files, with the repository specified as the source of a pip / npm package

What is the standard way to share .proto files between repositories?
You can certainly send even a binary payload with an HTTP request, or in an HTTP response. Just write the bytes of the protocol buffer directly into the request/response, and make sure to set the content type to "application/octet-stream". The client, and server, should be able to take care of the rest easily.
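As a minimal sketch of the approach above, the following stand-alone example round-trips a binary payload over HTTP using only the Python standard library. The byte string stands in for serialized protobuf output (in practice it would come from `msg.SerializeToString()` on a generated message class); the server, handler, and payload names are illustrative, not part of any particular API.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Hypothetical wire-format bytes; a real service would use the output of
# msg.SerializeToString() from the protoc-generated Python classes.
PAYLOAD = b"\x08\x96\x01"

class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        body = self.rfile.read(length)  # raw protobuf bytes, no decoding
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Write the bytes directly into the request body and mark them as opaque binary.
req = Request(
    f"http://127.0.0.1:{server.server_port}/",
    data=PAYLOAD,
    headers={"Content-Type": "application/octet-stream"},
)
with urlopen(req) as resp:
    echoed = resp.read()

server.shutdown()
print(echoed == PAYLOAD)  # the bytes round-trip unchanged
```

The receiving side would then hand the raw body to the generated message's `ParseFromString()`; nothing about HTTP itself needs to understand the protobuf encoding.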
So long as you're careful about when and how you change and remove fields, your protobuf will be forward and backward compatible.
This depends on your development process.
A git subtree / submodule seems like a sensible solution for most purposes. If you had more downstream projects, publishing a ready-made module would make sense, as then the protobuf generator wouldn't be needed for every project.
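To make the subtree option concrete, here is a self-contained sketch of the workflow. Local temporary directories stand in for what would normally be a shared remote "protos" repo and a service repo; all names and the sample .proto content are placeholders, and the `git subtree` subcommand must be available in your git installation.

```shell
set -e
work=$(mktemp -d)

# A dedicated repo holding only the .proto files (placeholder content).
git init -q "$work/protos"
cd "$work/protos"
git config user.email ci@example.com && git config user.name ci
echo 'syntax = "proto3";' > service.proto
git add service.proto && git commit -qm "add service.proto"

# A service repo that pulls the protos repo in as a subtree under proto/.
git init -q "$work/service"
cd "$work/service"
git config user.email ci@example.com && git config user.name ci
git commit -qm "initial" --allow-empty
git subtree add --prefix=proto "$work/protos" \
    "$(git -C "$work/protos" branch --show-current)" --squash

# The shared .proto files now live inside the service's tree.
cat proto/service.proto
```

Later, `git subtree pull --prefix=proto <protos-repo> <branch> --squash` brings upstream .proto changes into each service, and each service runs its own protoc code generation step against the `proto/` directory.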
We were in the same situation and used three repos: the server side was written in C++, the client side in ActionScript 3, and the protobufs lived in a third repo used by both. For a big team and a big project, I think it was a good choice.