TL;DR: How do I migrate my build steps (compile a proto descriptor with protoc, deploy with gcloud to Cloud Endpoints, build a Docker image) to Google Cloud Build?
Detail:
I run a Python gRPC service and ESP in Docker containers on Google Compute Engine. The diagram under About gRPC > API management shows my application architecture:
My high-level build steps:
1) Create the descriptor file, api_descriptor.pb, using the protoc protocol buffers compiler:
python -m grpc_tools.protoc \
--include_imports \
--include_source_info \
--proto_path=. \
--descriptor_set_out=api_descriptor.pb \
--python_out=generated_pb2 \
--grpc_python_out=generated_pb2 \
bookstore.proto
2) Deploy the proto descriptor file (api_descriptor.pb) and the configuration file using the gcloud command-line tool:
gcloud endpoints services deploy api_descriptor.pb api_config.yaml
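For reference, the configuration file here is a Cloud Endpoints service config. A minimal sketch might look like the following (the service name, title, and API name are hypothetical placeholders for your own):

```yaml
# api_config.yaml (sketch): Cloud Endpoints service configuration
# for a gRPC API. All names below are placeholders.
type: google.api.Service
config_version: 3
name: bookstore.endpoints.my-cloud-project-id.cloud.goog
title: Bookstore gRPC API
apis:
  - name: endpoints.examples.bookstore.Bookstore
```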
3) Generate gRPC code using the Python plugin:
python -m grpc_tools.protoc -I../../protos --python_out=. --grpc_python_out=. ../../protos/helloworld.proto
4) Build the final Docker image to deploy on Google Compute Engine. The resulting Docker image should include the service's dependencies, the protos, and the generated gRPC code.
Step 4) builds the 'gRPC Server' (rightmost blue box in the accompanying diagram) using the following Dockerfile:
FROM gcr.io/google_appengine/python:latest
EXPOSE 8081
ADD requirements.txt .
ADD protos ./protos
# server.py (the gRPC server itself) must also be in the build context
ADD server.py .
RUN mkdir out
RUN apt-get update && \
    apt-get install -y python2.7 python-pip && \
    pip install -r requirements.txt
RUN python \
    -m grpc_tools.protoc \
    --python_out=out \
    --grpc_python_out=out \
    --proto_path=protos \
    protos/bookstore.proto
ENTRYPOINT ["python", "server.py"]
I'm migrating these build steps to Google's Cloud Build.
AFAICT my high-level build steps should map onto official Cloud Build builder images.
1) ???
2) Use cloud-builders/gcloud/ to run gcloud
commands.
3) ???
4) Use cloud-builders/docker to build 'gRPC Server' Docker image.
Steps 2) and 4) already have cloud builders available (see GoogleCloudPlatform/cloud-builders).
However, I'm unsure how to migrate steps 1) and 3) to Cloud Build. Both steps require running a Python plugin which is not available in a base Linux Docker image.
AFAICT step 1) should produce a Cloud Build artifact for api_descriptor.pb and save it to a Cloud Storage bucket.
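Something like a Cloud Build artifacts stanza could do the upload, assuming a bucket of my own (the bucket name below is a placeholder):

```yaml
# cloudbuild.yaml fragment (sketch): upload the generated descriptor
# to a Cloud Storage bucket after the build completes.
artifacts:
  objects:
    location: 'gs://my-artifacts-bucket/'
    paths: ['api_descriptor.pb']
```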
I got this working a few months ago. I don't know if I did it the "right" way. Judge for yourself :p
TL;DR: If you just want to use protoc with Google Cloud Build, I have submitted a protoc builder to the cloud builders community GitHub repository, which has been accepted. See cloud-builders-community/protoc.
Detail: My solution relies on creating a protoc Custom Build Step. This creates a Docker container image that the Cloud Build worker pulls and runs when it needs to run protoc.
You only need two files to create the protoc Custom Build Step:
cloudbuild.yaml - tells Google Cloud Build how to build the Docker image.
Dockerfile - tells Docker how to build the environment containing the protoc binary.
This was literally my local directory structure to achieve step 1):
.
├── cloudbuild.yaml
└── Dockerfile
The Dockerfile is where the protoc command is installed, and is the more complex of the two files:
FROM ubuntu
ARG PROTOC_VERSION=3.6.1
ARG PROTOC_TARGET=linux-x86_64
ARG ASSET_NAME=protoc-${PROTOC_VERSION}-${PROTOC_TARGET}.zip
RUN apt-get -qy update && apt-get -qy install python wget unzip && rm -rf /var/lib/apt/lists/*
RUN echo "${PROTOC_VERSION}/${ASSET_NAME}"
RUN wget https://github.com/google/protobuf/releases/download/v${PROTOC_VERSION}/protoc-${PROTOC_VERSION}-${PROTOC_TARGET}.zip && \
unzip ${ASSET_NAME} -d protoc && rm ${ASSET_NAME}
ENV PATH=$PATH:/protoc/bin/
ENTRYPOINT ["protoc"]
CMD ["--help"]
Breaking this down:

- Start from a base image that provides apt-get, wget, unzip, and rm:

FROM ubuntu

- Declare build arguments, each of which can be overridden at build time with the --build-arg <varname>=<value> flag:

ARG PROTOC_VERSION=3.6.1
ARG PROTOC_TARGET=linux-x86_64
ARG ASSET_NAME=protoc-${PROTOC_VERSION}-${PROTOC_TARGET}.zip

- Run apt-get -qy update to "resynchronize the package index files from their sources". q omits progress indicators, y assumes yes as an answer to any prompts encountered. Then install the packages we need and clean up the package lists:

RUN apt-get -qy update
RUN apt-get -qy install python wget unzip
RUN rm -rf /var/lib/apt/lists/*

The previous three RUN instructions can be combined into one:

RUN apt-get -qy update && apt-get -qy install python wget unzip && rm -rf /var/lib/apt/lists/*

- Download the protoc release asset from GitHub, unzip it into /protoc, and delete the zip:

RUN wget https://github.com/google/protobuf/releases/download/v${PROTOC_VERSION}/${ASSET_NAME} && \
unzip ${ASSET_NAME} -d protoc && rm ${ASSET_NAME}

- Extend the PATH environment variable so that the final image can locate the protoc binary:

ENV PATH=$PATH:/protoc/bin/

- Set the ENTRYPOINT of the image so that the image runs as a protoc executable. Note: since the previous step added protoc to $PATH, we need only specify the binary name (not the full path):

ENTRYPOINT ["protoc"]

- Provide a default argument so that, if you run the protoc image without arguments, protoc --help will run:

CMD ["--help"]
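If you want to sanity-check the ARG substitutions before building, the same expansions can be reproduced in plain shell to preview the asset name and download URL the build will use:

```shell
# Mirror the Dockerfile's ARG substitutions in plain shell.
PROTOC_VERSION=3.6.1
PROTOC_TARGET=linux-x86_64
ASSET_NAME=protoc-${PROTOC_VERSION}-${PROTOC_TARGET}.zip
echo "${ASSET_NAME}"
# protoc-3.6.1-linux-x86_64.zip
echo "https://github.com/google/protobuf/releases/download/v${PROTOC_VERSION}/${ASSET_NAME}"
```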
That's all we need to define an executable protoc Docker image. However, it's not yet a Custom Build Step that can be used in Google's Cloud Build environment. We must define the custom build step using cloudbuild.yaml:
steps:
- name: 'gcr.io/cloud-builders/docker'
  args:
    [
      'build',
      '--tag',
      'gcr.io/$PROJECT_ID/protoc',
      '--cache-from',
      'gcr.io/$PROJECT_ID/protoc',
      '.',
    ]
images: ['gcr.io/$PROJECT_ID/protoc']
This file will generate an artefact, gcr.io/my-cloud-project-id/protoc, which can be used to run protoc in Google Cloud Build. Example usage of this custom build step:
steps:
- name: 'gcr.io/$PROJECT_ID/protoc'
  args:
    [
      '--include_imports',
      '--include_source_info',
      '--proto_path',
      '.',
      '--descriptor_set_out',
      'api_descriptor.pb',
      'v1/my-api-proto.proto',
    ]
Cloud Build automatically replaces $PROJECT_ID with your project ID, so the name will reference the artefact gcr.io/my-cloud-project-id/protoc. Since this is an executable Docker image (defined with ENTRYPOINT ["protoc"]), it's equivalent to running locally:
protoc --include_imports --include_source_info --proto_path . --descriptor_set_out api_descriptor.pb v1/my-api-proto.proto
So, in answer to my question, both steps 1) and 3) can use the protoc custom build step to run in Google Cloud Build.
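Putting it together, steps 1) and 2) could be sketched in a single cloudbuild.yaml, assuming api_config.yaml sits alongside the protos (the file layout below is hypothetical):

```yaml
# cloudbuild.yaml (sketch): compile the descriptor with the protoc
# custom build step, then deploy it to Cloud Endpoints with gcloud.
steps:
- name: 'gcr.io/$PROJECT_ID/protoc'
  args:
    [
      '--include_imports',
      '--include_source_info',
      '--proto_path', '.',
      '--descriptor_set_out', 'api_descriptor.pb',
      'v1/my-api-proto.proto',
    ]
- name: 'gcr.io/cloud-builders/gcloud'
  args:
    [
      'endpoints', 'services', 'deploy',
      'api_descriptor.pb', 'api_config.yaml',
    ]
```

Submit it with gcloud builds submit --config cloudbuild.yaml . from the directory containing the protos.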