In the documentation of the Azure pipelines, I read that:
Each agent can run only one job at a time. To run multiple jobs in parallel you must configure multiple agents.
When you run a pipeline on a self-hosted agent, by default, none of the sub-directories are cleaned in between two consecutive runs. As a result, you can do incremental builds and deployments, provided that tasks are implemented to make use of that. You can override this behavior using the workspace setting on the job.
Pipeline artifacts provide a way to share files between stages in a pipeline or between different pipelines. They are typically the output of a build process that needs to be consumed by another job or be deployed.
As a beginner, after reading this, I have some doubts:

1. If I have 2 jobs (the 2nd job runs after the 1st) in an azure-pipelines.yaml, will both jobs run on the same agent? (It is clear that in the parallel job run case, it would need multiple agents.)
2. Do different jobs in the same pipeline share the same workspace that can be referenced via the variable Pipeline.Workspace?
3. The 1st job generates some files in one step. Is it possible to consume those files in the 2nd job without using artifacts?

Can someone please help me to clear these doubts?
There is a maximum limit of 25 parallel jobs for Microsoft-hosted agents. Starting with Azure DevOps Server 2019, you do not have to pay for self-hosted concurrent jobs in releases.
If I have 2 jobs (2nd job runs after the 1st) in an azure-pipelines.yaml, will both the jobs run on the same agent?
Strictly speaking, NO, neither the UI nor YAML can achieve that.

As you saw in the document: each agent can run only one job at a time. The design logic is that, in theory, each job is an independent running unit; communication between different jobs requires "middleware", such as variables and artifacts.
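As a minimal sketch of the "middleware" idea with variables, an output variable set in one job can be mapped into a dependent job (the job and step names here are illustrative):

```yaml
jobs:
- job: job01
  steps:
  # 'isOutput=true' makes the variable visible to dependent jobs
  - script: echo "##vso[task.setvariable variable=myVar;isOutput=true]hello"
    name: setVarStep
    displayName: Set an output variable
- job: job02
  dependsOn: job01
  variables:
    # map job01's output variable into this job
    varFromJob01: $[ dependencies.job01.outputs['setVarStep.myVar'] ]
  steps:
  - script: echo $(varFromJob01)
    displayName: Read the variable set by job01
```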
Do different jobs in the same pipeline share the same workspace that can be referenced via the variable Pipeline.Workspace?
Sample1:

I have 2 jobs in one pipeline: one is job01 and the other is job02.

In job01, I create a JSON file named project.json under $(Pipeline.Workspace):
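A sketch of what job01 could look like (the original sample's exact steps are not shown, so the file content here is illustrative):

```yaml
jobs:
- job: job01
  steps:
  # write a small JSON file into the pipeline workspace
  - script: echo '{ "name": "demo" }' > $(Pipeline.Workspace)/project.json
    displayName: Create project.json
```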
In job02, print the list of files under the path $(Pipeline.Workspace):
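A corresponding sketch of job02 (again assuming illustrative step contents):

```yaml
- job: job02
  dependsOn: job01
  steps:
  # list everything under the pipeline workspace
  - script: ls -R $(Pipeline.Workspace)
    displayName: List files in Pipeline.Workspace
```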
As you can see, the second job cannot access the output directory of the first job.
BUT, there is one special case: the pipeline runs in a self-hosted agent pool, and only one agent exists in that pool. In that case, both jobs run on the same agent, since there is only one agent available. And if you do not manually perform a clean operation in the job definition, files can be shared between jobs in this special scenario, because the jobs use the same local path.
Sample2:

Same as the previous sample, but this time change the running pool to one that contains only 1 agent.
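A minimal sketch of this setup, assuming a hypothetical self-hosted pool named SingleAgentPool that contains exactly one agent:

```yaml
pool:
  name: SingleAgentPool   # hypothetical self-hosted pool with exactly one agent

jobs:
- job: job01
  # no 'workspace: clean:' setting, so the workspace is left on disk between jobs
  steps:
  - script: echo '{ "name": "demo" }' > $(Pipeline.Workspace)/project.json
    displayName: Create project.json
- job: job02
  dependsOn: job01
  steps:
  # both jobs land on the single agent, so the file is still there
  - script: ls $(Pipeline.Workspace)
    displayName: List files in Pipeline.Workspace
```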
The 1st job generates some files in one step. Is it possible to consume those files in the 2nd job without using artifacts?
I think the special-case description and Sample2 above have already answered this. Yes, it is possible, and you can refer to them to achieve this. But in most cases, we recommend using artifacts to pass files between jobs.
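For reference, a minimal sketch of the recommended approach using the publish/download YAML shortcut steps (the artifact name projectFiles and file content are illustrative):

```yaml
jobs:
- job: job01
  steps:
  - script: echo '{ "name": "demo" }' > $(Build.ArtifactStagingDirectory)/project.json
    displayName: Create project.json
  # publish the staging directory as a pipeline artifact
  - publish: $(Build.ArtifactStagingDirectory)
    artifact: projectFiles
- job: job02
  dependsOn: job01
  steps:
  # 'download: current' places the artifact under $(Pipeline.Workspace)/projectFiles
  - download: current
    artifact: projectFiles
  - script: cat $(Pipeline.Workspace)/projectFiles/project.json
    displayName: Read project.json from the artifact
```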