I have an existing AWS Step Functions orchestration that executes an AWS Batch job via Lambda functions. However, AWS has recently added the ability to directly invoke other services such as AWS Batch from a step. I am keen to use this new functionality but cannot get it working.
https://docs.aws.amazon.com/step-functions/latest/dg/connectors-batch.html
Here is the new step that I want to use to invoke Batch:
"File Copy": {
"Type": "Task",
"Resource": "arn:aws:states:::batch:submitJob.sync",
"Parameters": {
"JobName": "MyBatchJob",
"JobQueue": "MySecondaryQueue",
"ContainerOverrides.$": "$.lts_job_container_overrides",
"JobDefinition.$": "$.lts_job_job_definition",
},
"Next": "Upload Start"
}
Note that I am using the "$." JSONPath syntax so that parameters are passed through dynamically from the step input.
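(For context, a minimal sketch of the two forms: a key ending in ".$" is resolved as a JSONPath expression against the state input, while a plain key takes a literal value.)

"Parameters": {
    "JobQueue": "MySecondaryQueue",
    "JobDefinition.$": "$.lts_job_job_definition"
}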
When given the following input:
"lts_job_container_overrides": {
"environment": [
{
"name": "MY_ENV_VARIABLE",
"value": "XYZ"
},
],
"command": [
"/app/file_copy.py"
]
},
"lts_job_job_definition": "MyBatchJobDefinition"
I expected the environment and command values to be passed through to the corresponding parameter (ContainerOverrides) in AWS Batch. Instead, it appears that Step Functions is trying to promote them to top-level parameters, and then complaining that they are not valid.
{
    "error": "States.Runtime",
    "cause": "An error occurred while executing the state 'File Copy' (entered at the event id #29). The Parameters '{\"ContainerOverrides\":{\"environment\":[{\"name\":\"MY_ENV_VARIABLE\",\"value\":\"XYZ\"}],\"command\":[\"/app/file_copy.py\"]},\"JobDefinition\":\"MyBatchJobDefinition\"}' could not be used to start the Task: [The field 'environment' is not supported by Step Functions, The field 'command' is not supported by Step Functions]"
}
How can I stop Step Functions from attempting to interpret the values I am trying to pass through to AWS Batch?
I have tried taking JSONPath out of the mix and specifying the ContainerOverrides statically (even though this won't be a long-term solution). But even then I encounter issues.
"ContainerOverrides": {
"environment": [
{
"name": "RUN_ID",
"value": "xyz"
}
],
"command": "/app/file_copy.py"
}
In this case, Step Functions itself rejects the definition on load:
Invalid State Machine Definition: 'SCHEMA_VALIDATION_FAILED: The field 'environment' is not supported by Step Functions at /States/File Copy/Parameters, SCHEMA_VALIDATION_FAILED: The field 'command' is not supported by Step Functions at /States/File Copy/Parameters'
So it just appears that ContainerOverrides is problematic, full stop? Have I misunderstood how it is intended to be used in this scenario?
The above issue has been resolved (as per the answer below), and the following note has since been added to the AWS documentation:

Note
Parameters in Step Functions are expressed in PascalCase, even when the native service API is camelCase.
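Concretely, the native Batch SubmitJob API takes camelCase fields, while the Step Functions integration expects the PascalCase equivalents. A minimal side-by-side sketch of the same override:

Native Batch SubmitJob request (camelCase):

"containerOverrides": {
    "environment": [ { "name": "MY_ENV_VARIABLE", "value": "XYZ" } ],
    "command": [ "/app/file_copy.py" ]
}

Step Functions Parameters for the same call (PascalCase):

"ContainerOverrides": {
    "Environment": [ { "Name": "MY_ENV_VARIABLE", "Value": "XYZ" } ],
    "Command": [ "/app/file_copy.py" ]
}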
This should work; I've tested it and it is working fine for me. Both Environment (including its object keys Name and Value) and Command should have their first letter capitalized.
{
    "StartAt": "AWS Batch: Manage a job",
    "States": {
        "AWS Batch: Manage a job": {
            "Type": "Task",
            "Resource": "arn:aws:states:::batch:submitJob.sync",
            "Parameters": {
                "JobName": "test",
                "JobDefinition": "jobdef",
                "JobQueue": "testq",
                "ContainerOverrides": {
                    "Command": [
                        "/app/file_copy.py"
                    ],
                    "Environment": [
                        {
                            "Name": "MY_ENV_VARIABLE",
                            "Value": "XYZ"
                        }
                    ]
                }
            },
            "End": true
        }
    }
}
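The same rule should apply when passing ContainerOverrides dynamically with the ".$" syntax, as in the original question: the PascalCase keys then need to appear in the state input rather than in the definition. An untested sketch of what that input would look like (field names taken from the question):

"lts_job_container_overrides": {
    "Environment": [
        {
            "Name": "MY_ENV_VARIABLE",
            "Value": "XYZ"
        }
    ],
    "Command": [
        "/app/file_copy.py"
    ]
},
"lts_job_job_definition": "MyBatchJobDefinition"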