
How to pass script arguments to AWS Batch fetch-and-run

I am following this tutorial to run a simple fetch-and-run example in AWS Batch. However, I'm unable to pass arguments to the script fetched through this example.

The basic example will produce this execution:

export BATCH_FILE_TYPE="script"
export BATCH_FILE_S3_URL="s3://my-bucket/my-script"
fetch_and_run.sh script-from-s3 [ <script arguments> ]

where the script arguments are only mentioned in this passage of the tutorial:

This shows that it supports two values for BATCH_FILE_TYPE, either “script” or “zip”. When you set “script”, it causes fetch_and_run.sh to download a single file and then execute it, in addition to passing in any further arguments to the script.
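
For context, when BATCH_FILE_TYPE is "script", fetch_and_run.sh downloads the file referenced by BATCH_FILE_S3_URL and then exec's it, forwarding its own remaining command-line arguments. A paraphrased sketch of that branch (not the exact source, and the handling of the first argument may differ slightly):

TMPFILE=$(mktemp)
aws s3 cp "${BATCH_FILE_S3_URL}" "${TMPFILE}"   # download the script named by the env var
chmod u+x "${TMPFILE}"                          # make it executable
exec "${TMPFILE}" "${@}"                        # run it, forwarding any extra arguments

So anything placed after the script name on fetch_and_run.sh's command line should reach the downloaded script as positional arguments.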

I tried passing them with the AWS CLI through the --parameters and --container-overrides options (in the latter, under the command key), but they are not received by the script.

I would prefer not to modify my Dockerfile ENTRYPOINT for each run, nor the fetch_and_run.sh script, but I cannot figure out how to achieve this any other way.

asked Oct 11 '18 by gc5


1 Answer

Mixing these examples of job definitions, I achieved it with AWS Batch using:

aws batch submit-job --job-name <job_name> --job-definition <job_def_name> \
  --job-queue <queue_name> \
  --container-overrides '{
    "command": ["<script-from-s3>", "Ref::param1", "Ref::param2"],
    "environment": [
      {"name": "BATCH_FILE_S3_URL", "value": "<script-from-s3>"},
      {"name": "BATCH_FILE_TYPE", "value": "script"}]}' \
  --parameters '{"param1": "<param1>", "param2": "<param2>"}'
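
AWS Batch substitutes Ref::param1 and Ref::param2 in the overridden command with the values supplied under --parameters, so inside the container fetch_and_run.sh is invoked with the script name followed by those two values, which it then forwards to the downloaded script.

For completeness, the Ref:: placeholders can also be given default values in the job definition itself; a minimal sketch, where the image name and default values are placeholders and not part of the original answer:

aws batch register-job-definition --job-definition-name <job_def_name> \
  --type container \
  --parameters '{"param1": "<default1>", "param2": "<default2>"}' \
  --container-properties '{
    "image": "<account>.dkr.ecr.<region>.amazonaws.com/awsbatch/fetch_and_run",
    "vcpus": 1,
    "memory": 512,
    "command": ["<script-from-s3>", "Ref::param1", "Ref::param2"]}'

With defaults registered this way, --parameters at submit time only needs to override the values that change between runs.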
answered Sep 24 '22 by gc5