We have a Go application that submits Argo Workflows to a Kubernetes cluster on request. I'd like to pass a YAML file to one of the steps, and I'm wondering what the options are for doing this.
Eventually, I would like to pass this file to the test step as shown in this example:
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: test-
spec:
  entrypoint: test
  templates:
    - name: test
      script:
        image: gcr.io/testproj/test:latest
        command: [bash]
        source: |
          python test.py --config_file_path=/path/to/config.yaml
The image used in this step has a Python script that receives the path to this file and then accesses it.
To submit the Argo workflows with golang, we use the following dependencies:
Thank you.
Workflow parameters are usually small bits of text or numbers. But if your YAML file is reasonably small, you could string-encode it and pass it as a parameter.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: test-
spec:
  entrypoint: test
  arguments:
    parameters:
      - name: yaml
        value: "string-encoded yaml"
  templates:
    - name: test
      script:
        image: gcr.io/testproj/test:latest
        command: [bash]
        source: |
          # In this case, the string-encoding should be bash-compatible.
          python test.py --config_file_as_string="{{workflow.parameters.yaml}}"
Argo supports multiple types of artifacts. Perhaps the simplest for your use case is the raw artifact type.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: test-
spec:
  entrypoint: test
  templates:
    - name: test
      inputs:
        artifacts:
          - name: yaml
            path: /path/to/config.yaml
            raw:
              data: |
                this is
                the raw file
                contents
      script:
        image: gcr.io/testproj/test:latest
        command: [bash]
        source: |
          python test.py --config_file_path=/path/to/config.yaml
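If your Go app templates the manifest as text rather than building it as a struct, the file contents have to be indented to match the nesting of the `raw.data: |` block. A small stdlib sketch (splicing YAML as strings is fragile; setting the field on a typed Workflow struct or a map and marshaling it is safer):

```go
package main

import (
	"fmt"
	"strings"
)

// indentForRawData indents every line of the file contents so the block
// can be spliced under the artifact's "data: |" key in the workflow YAML.
func indentForRawData(contents, indent string) string {
	lines := strings.Split(strings.TrimRight(contents, "\n"), "\n")
	for i, line := range lines {
		lines[i] = indent + line
	}
	return strings.Join(lines, "\n")
}

func main() {
	config := "this is\nthe raw file\ncontents\n"
	// Sixteen spaces matches the nesting of "data: |" in the example above.
	fmt.Println(indentForRawData(config, strings.Repeat(" ", 16)))
}
```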
Besides raw, Argo supports "S3, Artifactory, HTTP, [and] Git" artifacts (among others, I think).
If, for example, you chose to use S3, you could upload the file from your golang app and then pass the S3 bucket and key as parameters.
I'm not familiar with the Go client, but passing parameters is certainly supported, and I think passing a raw artifact should be supported as well.
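To show the shape of what your Go app would submit, here's a sketch that injects the encoded YAML as the `yaml` workflow parameter. It builds the manifest as a generic map and marshals it to JSON (which the Kubernetes API also accepts); the actual submission via the Argo Go client or a typed Workflow struct is assumed, not shown:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// buildWorkflow returns a Workflow manifest (as a generic map) with the
// string-encoded YAML injected as the "yaml" workflow parameter.
func buildWorkflow(encodedYAML string) map[string]interface{} {
	return map[string]interface{}{
		"apiVersion": "argoproj.io/v1alpha1",
		"kind":       "Workflow",
		"metadata":   map[string]interface{}{"generateName": "test-"},
		"spec": map[string]interface{}{
			"entrypoint": "test",
			"arguments": map[string]interface{}{
				"parameters": []map[string]interface{}{
					{"name": "yaml", "value": encodedYAML},
				},
			},
		},
	}
}

func main() {
	wf := buildWorkflow("a2V5OiB2YWx1ZQo=")
	out, err := json.MarshalIndent(wf, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```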