
What's the best way to inject a yaml file into an Argo workflow step?

Summary:

We have a golang application that submits Argo workflows to a Kubernetes cluster on request. I'd like to pass a yaml file to one of the steps, and I'm wondering what the options are for doing this.

Environment:

  • Argo: v2.4.2
  • K8s: 1.13.12-gke.25

Additional details:

Eventually, I would like to pass this file to the test step as shown in this example:

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: test-
spec:
  entrypoint: test
  templates:
  - name: test
    script:
      image: gcr.io/testproj/test:latest
      command: [bash]
      source: |
        python test.py --config_file_path=/path/to/config.yaml

The image used in this step contains a python script that receives the path to this file and then reads it.

To submit the Argo workflows with golang, we use the following dependencies:

  • https://github.com/argoproj/argo-workflows/tree/master/pkg/client
  • https://github.com/argoproj/argo-workflows/tree/master/pkg/apis

Thank you.

Asked Mar 19 '20 by Ash


1 Answer

Option 1: pass the file as a parameter

Workflow parameters are usually small bits of text or numbers. But if your yaml file is reasonably small, you could string-encode it and pass it as a parameter.

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: test-
spec:
  entrypoint: test
  arguments:
    parameters:
    - name: yaml
      value: "string-encoded yaml"
  templates:
  - name: test
    script:
      image: gcr.io/testproj/test:latest
      command: [bash]
      source: |
        # In this case, the string-encoding should be bash-compatible.
        python test.py --config_file_as_string="{{workflow.parameters.yaml}}"
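For what it's worth, one way to produce a bash-compatible encoding on the submitting side is plain single-quote escaping. A stdlib-only sketch in go (the helper name and the sample yaml are mine, not part of your app):

```go
package main

import (
	"fmt"
	"strings"
)

// shellQuote wraps s in single quotes so a multi-line yaml document
// survives substitution into a bash command line as one argument.
// Embedded single quotes are closed, escaped, and reopened.
func shellQuote(s string) string {
	return "'" + strings.ReplaceAll(s, "'", `'\''`) + "'"
}

func main() {
	// Stand-in for the real config file contents.
	yaml := "key: value\nnested:\n  a: 1\n"
	// The quoted string is what you'd set as the parameter value.
	fmt.Println(shellQuote(yaml))
}
```

The receiving script then gets the original text back as a single argument and can write it to a temp file or parse it directly.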

Option 2: pass the file as an artifact

Argo supports multiple types of artifacts. Perhaps the simplest for your use case is the raw artifact type.

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: test-
spec:
  entrypoint: test
  templates:
  - name: test
    inputs:
      artifacts:
      - name: yaml
        path: /path/to/config.yaml
        raw:
          data: |
            this is
            the raw file
            contents
    container:
      image: gcr.io/testproj/test:latest
      command: [bash]
      source: |
        python test.py --config_file_path=/path/to/config.yaml

Besides raw, Argo supports "S3, Artifactory, HTTP, [and] Git" artifacts (among others, I think).

If, for example, you chose to use S3, you could upload the file from your golang app and then pass the S3 bucket and key as parameters.

Golang client

I'm not familiar with the golang client, but passing parameters is certainly supported, and I think passing a raw artifact should be supported as well.

Answered Oct 21 '22 by crenshaw-dev