Triggering a build for a specific git commit via gcloud commandline tool


All the examples that I've come across have been of the following format:

gcloud container builds submit --config cloudbuild.yaml .

The man-page says the following:

 [SOURCE]
    The source directory on local disk or tarball in Google Cloud Storage
    or disk to use for this build. If source is a local directory this
    command skips files specified in the .gcloudignore file (see $ gcloud
    topic gcloudignore for more information).

Now, the source-directory on my local disk is very large and a lot of time is being spent in transferring the source code from my local machine to the Google build servers/cloud. Is either of the following possible? How?

  • Give a git/github URL instead of local source-directory
  • My git-repo is incidentally being mirrored in Google Source Repository as well, because I have setup build triggers for my repo. Can I give a URL to the repo being mirrored by Google?
Asked Feb 24 '18 by Saurabh Nanda



2 Answers

Unfortunately there isn't great support for this today in gcloud. You can accomplish this a few other ways though:

  1. Use curl or the client library of your choice to send an API request to request a build that specifies a RepoSource. For example:

    {
      "source": {
        "repoSource": {
          "repoName": "my-repo",
          "commitSha": "deadbeef"
        }
      },
      "steps": [...]
    }

  2. In your local environment, fetch the commit and build it using gcloud:

    git checkout <commit-sha> && gcloud container builds submit . --config=cloudbuild.yaml

  3. Create a trigger that automatically executes your build, then issue an API request to run the trigger manually, on the specific commit you want, again using curl or a client library.
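A minimal sketch of option 1, assuming hypothetical names (`my-project`, `my-repo`, `my-image`, commit `deadbeef`) and that you are authenticated with gcloud. The request body is written to a file and then posted to the Cloud Build REST API:

```shell
# Build request referencing a commit in a Cloud Source Repository
# (projectId, repoName, commitSha and the image name are placeholders).
cat > request.json <<'EOF'
{
  "source": {
    "repoSource": {
      "projectId": "my-project",
      "repoName": "my-repo",
      "commitSha": "deadbeef"
    }
  },
  "steps": [
    {
      "name": "gcr.io/cloud-builders/docker",
      "args": ["build", "-t", "gcr.io/my-project/my-image", "."]
    }
  ]
}
EOF

# Submit the build (uncomment to run against a real project):
# curl -X POST \
#   -H "Authorization: Bearer $(gcloud auth print-access-token)" \
#   -H "Content-Type: application/json" \
#   -d @request.json \
#   "https://cloudbuild.googleapis.com/v1/projects/my-project/builds"
```

For option 3, the REST API similarly exposes a `run` method on build triggers that accepts a RepoSource naming the commit, so the same curl pattern applies.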

Answered Sep 23 '22 by Jason Hall


If you are building Docker images, you can build on top of a cached image already present in your container registry. If you have only changed the last layers of the build, this avoids transferring most of the data and rebuilds only the changed layers.

As in the linked example, you can add a --cache-from argument to the .yaml file, selecting the image in your Google Container Registry to build on:

steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['pull', 'gcr.io/$PROJECT_ID/latest-image']
- name: 'gcr.io/cloud-builders/docker'
  args: [
            'build',
            '--cache-from',
            'gcr.io/$PROJECT_ID/latest-image',
            '-t', 'gcr.io/$PROJECT_ID/latest-image',
            '.'
        ]
images: ['gcr.io/$PROJECT_ID/latest-image']

Then, the command to build:

gcloud container builds submit --config cloudbuild.yaml .

This should save you quite a bit of transfer time.
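One caveat worth noting (an assumption, not stated in the answer above): if `latest-image` does not yet exist in the registry, the plain `docker pull` step fails and aborts the build. A common workaround is to run the pull through bash and tolerate failure, so the first build simply proceeds without a cache:

```yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
  entrypoint: 'bash'
  args: ['-c', 'docker pull gcr.io/$PROJECT_ID/latest-image || exit 0']
```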

Answered Sep 23 '22 by DevopsTux