 

Share files between azure pipeline Jobs

In the Azure Pipelines documentation, I read that:

Each agent can run only one job at a time. To run multiple jobs in parallel you must configure multiple agents.

When you run a pipeline on a self-hosted agent, by default, none of the sub-directories are cleaned in between two consecutive runs. As a result, you can do incremental builds and deployments, provided that tasks are implemented to make use of that. You can override this behavior using the workspace setting on the job.

Pipeline artifacts provide a way to share files between stages in a pipeline or between different pipelines. They are typically the output of a build process that needs to be consumed by another job or be deployed.

As a beginner, after reading this, I have some doubts:

  1. If I have 2 jobs (the 2nd job runs after the 1st) in an azure-pipelines.yaml, will both jobs run on the same agent? Do different jobs in the same pipeline share the same workspace, which can be referenced via the variable Pipeline.Workspace? (It is clear that running jobs in parallel would need multiple agents.)
  2. The 1st job generates some files in one step. Is it possible to consume those files in the 2nd job without using artifacts? (Normally, with artifacts, the 1st job publishes the artifact and the 2nd job downloads it.)

Can someone please help me clear these doubts?

Asked Mar 04 '20 by AnjanaAK

People also ask

How many deployment jobs can be run concurrently by a single agent?

There is a maximum limit of 25 parallel jobs for Microsoft-hosted agents. Starting with Azure DevOps Server 2019, you do not have to pay for self-hosted concurrent jobs in releases.

How do you trigger a pipeline from another pipeline Azure DevOps?

To trigger a pipeline upon the completion of another pipeline, configure a pipeline resource trigger. The following example configures a pipeline resource trigger so that a pipeline named app-ci runs after any run of the security-lib-ci pipeline completes. This example has the following two pipelines.
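Since the two pipeline definitions from that example are not reproduced here, a rough sketch of what they could look like (the pipeline names app-ci and security-lib-ci come from the example above; everything else is illustrative):

```yaml
# security-lib-ci: an ordinary pipeline, defined in its own YAML file
trigger:
- main

steps:
- script: echo "security-lib-ci build"
```

```yaml
# app-ci: declares security-lib-ci as a pipeline resource with a trigger,
# so it runs after every completed run of security-lib-ci
resources:
  pipelines:
  - pipeline: securitylib      # local alias for the resource
    source: security-lib-ci    # name of the pipeline that triggers this one
    trigger: true

steps:
- script: echo "app-ci build"
```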


1 Answer

If I have 2 jobs (the 2nd job runs after the 1st) in an azure-pipelines.yaml, will both jobs run on the same agent?

Strictly speaking, NO: neither the classic UI nor YAML can guarantee that.

As you saw in the documentation: each agent can run only one job at a time. The design logic is that, in principle, each job is an independent unit of work, so communication between different jobs has to go through some kind of "middleware", such as variables, artifacts and so on.
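As an illustration of the variable "middleware", here is a minimal sketch (job, step and variable names are made up) of passing a value from one job to the next with an output variable:

```yaml
jobs:
- job: A
  steps:
  - bash: echo "##vso[task.setvariable variable=myOutput;isOutput=true]hello"
    name: setVarStep           # the step name is part of the reference below

- job: B
  dependsOn: A
  variables:
    varFromA: $[ dependencies.A.outputs['setVarStep.myOutput'] ]
  steps:
  - bash: echo "value passed from job A: $(varFromA)"
```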

Do different jobs in the same pipeline share the same workspace that can be referenced via the variable Pipeline.Workspace?

Sample1:

I have 2 jobs in one pipeline: job01 and job02.

In job01, I create a JSON file named project.json under $(Pipeline.Workspace).


In job02, I list the files under $(Pipeline.Workspace).


You can see that the second job cannot access the files created by the first job.
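Since the original screenshots are not reproduced here, a rough YAML sketch of that experiment (the step contents are my reconstruction, not the original screenshots):

```yaml
jobs:
- job: job01
  steps:
  - bash: |
      echo '{ "name": "project" }' > $(Pipeline.Workspace)/project.json
      ls $(Pipeline.Workspace)
    displayName: Create project.json

- job: job02
  dependsOn: job01
  steps:
  # On a different agent, project.json does not show up here
  - bash: ls $(Pipeline.Workspace)
    displayName: List Pipeline.Workspace
```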


BUT, there is one special case: the pipeline runs in a self-hosted agent pool, and that pool contains only one agent.

In that case the jobs do run on the same agent, since it is the only agent in the pool. And if you do not configure a clean operation in the job definition, files can be shared between jobs in this special scenario, because every job uses the same constant local path.

Sample2:

Same as the previous sample, but this time the pipeline runs in a pool that contains only one agent; now job02 can see the file that job01 created.

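Again as a sketch, the only change from Sample1 is the pool (the pool name MySelfHostedPool is made up and must contain exactly one agent):

```yaml
pool:
  name: MySelfHostedPool       # hypothetical self-hosted pool with a single agent

jobs:
- job: job01
  steps:
  - bash: echo '{ "name": "project" }' > $(Pipeline.Workspace)/project.json

- job: job02
  dependsOn: job01
  steps:
  # Works only because both jobs land on the same agent and the workspace is not cleaned
  - bash: cat $(Pipeline.Workspace)/project.json
```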

The 1st job generates some files in one step. Is it possible to consume those files in the 2nd job without using artifacts?

I think the special-case description and Sample2 above have already answered this.

Yes, it is possible; you can follow that approach to meet this requirement. But in most cases we recommend using artifacts to pass files between jobs.
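For completeness, the recommended artifact route looks roughly like this (the artifact name drop is just an example):

```yaml
jobs:
- job: job01
  steps:
  - bash: echo '{ "name": "project" }' > $(Build.ArtifactStagingDirectory)/project.json
  - publish: $(Build.ArtifactStagingDirectory)
    artifact: drop

- job: job02
  dependsOn: job01
  steps:
  - download: current          # downloads pipeline artifacts of the current run
    artifact: drop             # files land under $(Pipeline.Workspace)/drop
  - bash: cat $(Pipeline.Workspace)/drop/project.json
```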

Answered Nov 05 '22 by Mengdi Liang