Converting proto buffer to ProtoRPC

In a Python script, mylibrary.py, I use Protocol Buffers to model data using the following approach (a minimal sketch follows the list):

  • Define message formats in a .proto file.
  • Compile them with the protocol buffer compiler (protoc).
  • Use the Python protocol buffer API to write and read messages in the .py module.
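
For illustration, a minimal sketch of that workflow; the Thing message and the mylibrary_pb2 module here are just illustrative, not the actual contents of mylibrary.py:

# mylibrary.proto might contain, for example:
#   message Thing {
#     string name  = 1;
#     int32  count = 2;
#   }
# compiled with: protoc --python_out=. mylibrary.proto
import mylibrary_pb2

thing = mylibrary_pb2.Thing(name="example", count=3)
data = thing.SerializeToString()    # write: serialize the message to bytes

parsed = mylibrary_pb2.Thing()
parsed.ParseFromString(data)        # read: parse the message back from bytes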

I want to implement a Cloud Endpoints Frameworks API on App Engine that imports and uses the aforementioned Python script; however, Endpoints Frameworks uses ProtoRPC, not 'standard' Protocol Buffers.

My App Engine Python module, main.py, imports from protorpc rather than using the 'offline' protoc compiler to generate serialization and deserialization code:

from protorpc import messages
from protorpc import remote

Messages are not defined using .proto files. Instead, classes are defined that inherit from protorpc.messages.Message:

class MyMessageDefinition(messages.Message):
    example_field = messages.StringField(1)  # hypothetical field, for illustration
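
For context, the remote import is used to define the service itself. A minimal ProtoRPC sketch (all names here are illustrative; Cloud Endpoints Frameworks layers its own endpoints.api / endpoints.method decorators on top of such classes) looks like:

class EchoRequest(messages.Message):
    text = messages.StringField(1)

class EchoResponse(messages.Message):
    text = messages.StringField(1)

class EchoService(remote.Service):
    @remote.method(EchoRequest, EchoResponse)
    def echo(self, request):
        # ProtoRPC handles (de)serialization; only the logic lives here.
        return EchoResponse(text=request.text)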

Can Protocol Buffer definitions be converted to ProtoRPC equivalents? I don't really want to change mylibrary.py to use ProtoRPC, since it's less generic than Protocol Buffers.

asked Mar 25 '17 by Jack

1 Answer

After eight months and lots of experimentation, I'll add my opinion. I hope it saves someone time.

Choose Your Framework First

There are several different Cloud Endpoints offerings from Google Cloud, and all of them can be used for JSON/REST APIs. This wasn't immediately clear to me: Cloud Endpoints is a very high-level term covering the development, deployment and management of APIs on multiple Google Cloud backends.

The point here is that after deciding to use Cloud Endpoints, you must still decide on backend technologies to serve your API. The documentation feels a little hidden away, but I strongly recommend starting with the Google Cloud Endpoints doc.

You can choose between:

  1. OpenAPI Specification
  2. Endpoints Frameworks
  3. gRPC

Choose Your Implementation Second

Within each API Framework there’s a choice of Cloud implementations upon which your API (service) can run:

OpenAPI Specification - for JSON/REST APIs implemented on:

  • Google App Engine flexible environment
  • Google Compute Engine
  • Google Container Engine
  • Kubernetes

Endpoints Frameworks - for JSON/REST APIs implemented on:

  • Google App Engine standard environment with Java
  • Google App Engine standard environment with Python

gRPC - for gRPC APIs implemented on:

  • Google Compute Engine
  • Google Container Engine
  • Kubernetes

When I posted this question, I was using Endpoints Frameworks running on the Google App Engine standard environment with Python. I later migrated my API (service) to gRPC on Google Compute Engine.

The observant among you may notice that both the OpenAPI Specification and Endpoints Frameworks can be used for JSON/REST APIs, while gRPC only exposes a gRPC API. So how did I port my REST API from Endpoints Frameworks to gRPC? The answer is Transcoding HTTP/JSON to gRPC (which I learnt along the way; it was not immediately clear to me either). So don't rule out gRPC just because you want REST/HTTP.

The Answer

So how does this relate to my original question?

The fact that I was trying to convert between .proto files and ProtoRPC annotations at all meant I had taken a wrong turn along the way.

If you want to write an application using plain .proto files, then choose gRPC on Compute Engine. If you need this to be a REST API, that can be done, but you'll need to add an Extensible Service Proxy (ESP) to your backend configuration. It's pretty much an NGINX server set up as a reverse proxy. The only downside here is you'll need some Docker knowledge to ensure the ESP (proxy) and your gRPC server can communicate (Docker networking).
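
To make that concrete, the gRPC server sitting behind the ESP is plain Python. A minimal sketch, assuming protoc has generated mylibrary_pb2.py and mylibrary_pb2_grpc.py from a hypothetical MyService definition in mylibrary.proto:

from concurrent import futures

import grpc

import mylibrary_pb2
import mylibrary_pb2_grpc

class MyServiceServicer(mylibrary_pb2_grpc.MyServiceServicer):
    def GetThing(self, request, context):
        # Core business logic lives here; no web server required.
        return mylibrary_pb2.Thing(name=request.name)

def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    mylibrary_pb2_grpc.add_MyServiceServicer_to_server(MyServiceServicer(), server)
    server.add_insecure_port('[::]:8000')  # the port the ESP container forwards to
    server.start()
    server.wait_for_termination()

if __name__ == '__main__':
    serve()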

If your code is already on App Engine, and you want to expose it as a REST API with minimal effort and still get good API management features, choose Endpoints Frameworks. Warning: I moved away from this because it was prohibitively expensive (I was getting billed in the region of $100 USD monthly).

If you want to avoid .proto files altogether, then go with the OpenAPI Specification.

Lastly, if you want to offer programmatic integration, client libraries, or a microservice, then really do consider gRPC. It's easy to remove the ESP (proxy) and run a gRPC server on nearly any machine (as long as the Protocol Buffer runtime is installed).
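
As an example of that programmatic integration, a client built from the same generated modules can call the server directly over gRPC, with no ESP or HTTP involved (again, all names are hypothetical and match the server sketch above):

import grpc

import mylibrary_pb2
import mylibrary_pb2_grpc

# Dial the gRPC server directly; swap in TLS credentials for production use.
channel = grpc.insecure_channel('my-backend-host:8000')
stub = mylibrary_pb2_grpc.MyServiceStub(channel)

response = stub.GetThing(mylibrary_pb2.GetThingRequest(name='example'))
print(response.name)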

Ultimately I settled on gRPC on Compute Engine with Docker. I also have an ESP to provide transcoding from HTTP/JSON to gRPC and vice versa. I like this approach for a few reasons:

  1. You learn a lot: Docker, Docker Networking, NGINX configuration, Protocol Buffers, ESP (Cloud Proxy), gRPC servers.
  2. The service (core business) logic can be written with plain-old gRPC. This allows the service to be run on any machine without a web server. Your business logic is the server :)
  3. Protocol Buffers / gRPC are excellent for isolating business logic as a service...or microservice. They're also good for providing well-defined interfaces and libraries.

Avoid These Mistakes

  • Implementing the first framework / architecture you find. If I could start again, I would not choose Endpoints Frameworks. It's expensive, and uses annotations rather than .proto files, which, IMO, makes the code harder to port.

  • Read Always Free Usage Limits before deciding upon a framework and implementation. Endpoints Frameworks uses backend App Engine instances - which have almost no free quota. Confusingly, frontend App Engine instances have a very generous free quota.

  • Consider local development. Cloud Endpoints local development servers are not officially supported (at least they weren't at the time of my question). Conversely, there's a whole page on Running a Local Extensible Service Proxy.

answered by Jack