Imagine you have the following Pipeline:
Job A (deploy) -> Job B (test) -> Job C (remove test deployment)
The pipeline should deploy a test image and run tests after a successful deployment. After the tests I want to run a cleanup script regardless of the test outcome, but only if the test image (Job A) was actually deployed.
To summarize: I want GitLab to execute Job C only if Job A succeeds, but after Job B.
Things that won't work:
- when: on_failure — Job A or Job B could have failed, but only Job A's result matters.
- when: always — Job C would run even if Job A failed, which would cause Job C to fail.
- when: on_success — requires all prior jobs to succeed.
I know that GitLab has a feature called DAG pipelines, which allows you to specify multiple dependencies on other jobs with the needs keyword, but sadly the when keyword is always scoped to all prior jobs. So you are not able to say something like:
when:
  on_success: job-a
  always: job-b
Am I missing something, or is there no way to achieve such a behaviour?
The needs DAG keyword can be used to conditionally execute the cleanup (Job C) when Job B fails or succeeds, but NOT when it is skipped because Job A failed.
Create 2 cleanup jobs that match the following boolean conditions:
- (Job A succeeds and Job B succeeds): If all previous jobs succeed (Job A and Job B), we can run the cleanup with when: on_success. However, this will not trigger if Job A succeeds and Job B fails.
- (Job A succeeds and Job B fails): To cover the previous scenario with an untriggered cleanup (Job C), we make use of the fact that if Job B fails, Job A must have succeeded earlier in the pipeline. By creating a duplicate cleanup job with a needs dependency on Job B and when: on_failure, that cleanup job will only run if Job A succeeds and Job B fails.
To reiterate: a cleanup job will run if (Job A succeeds and Job B succeeds) or (Job A succeeds and Job B fails), which by boolean reduction is equivalent to (Job A succeeds).
An obvious caveat is that there are now 2 cleanup jobs displayed in the pipeline; however, they are mutually exclusive and only one can ever be executed.
Here is a sample configuration:
stages:
  - deploy
  - test
  - cleanup

deploy_job:
  stage: deploy
  script:
    - echo Deployed
    - "true"  # change to "false" to simulate a failed deployment
  when: always

test_job:
  stage: test
  script:
    - echo Executing tests
    - "true"  # change to "false" to simulate failed tests
  when: on_success

# a YAML anchor reduces repetition
.cleanup_job: &cleanup_job
  stage: cleanup
  script:
    - echo Cleaned up deployment

cleanup_deployment_success:
  when: on_success
  <<: *cleanup_job

cleanup_deployment_failure:
  needs: ["test_job"]
  when: on_failure
  <<: *cleanup_job
With various intentional fail conditions (flipping "true" to "false" in the scripts above), the following pipeline states are produced:
Job A (deploy) | Job B (test) | Cleanup job that runs       | Pipeline result
success        | success      | cleanup_deployment_success  | passed
success        | failed       | cleanup_deployment_failure  | failed
failed         | skipped      | none                        | failed
Logically, this shows that regardless of whether Job B succeeded or failed, Job C runs if Job A succeeded. Furthermore, the failure state is preserved in the overall pipeline result.
That might have changed with GitLab 13.11 (April 2021):
Optional DAG (needs:) jobs in CI/CD pipelines
> The directed acyclic graph (DAG) in GitLab CI/CD lets you use the needs syntax to configure a job to start earlier than its stage (as soon as dependent jobs complete).
> We also have the rules, only, and except keywords, which determine if a job is added to a pipeline at all. Unfortunately, if you combine needs with these other keywords, it's possible that your pipeline could fail when a dependent job does not get added to the pipeline.
> In this release, we are adding the optional keyword to the needs syntax for DAG jobs.
> - If a dependent job is marked as optional but not present in the pipeline, the needs job ignores it.
> - If the job is optional and present in the pipeline, the needs job waits for it to finish before starting.
> This makes it much easier to safely combine rules, only, and except with the growing popularity of DAG.
See Documentation and Issue.
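A minimal sketch of what that syntax looks like, reusing the job names from the earlier example (assuming test_job might be excluded from some pipelines by rules):

```yaml
cleanup_deployment_failure:
  stage: cleanup
  needs:
    - job: test_job
      optional: true  # pipeline creation no longer fails if test_job is absent
  when: on_failure
  script:
    - echo Cleaned up deployment
```

With optional: true, the cleanup job still waits for test_job when it exists, but is no longer a hard dependency when rules remove test_job from the pipeline.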