I currently have a monorepo with services in subdirectories, and I'm leaning towards turning it into a multirepo with a metarepo.
One of the reasons I decided to give Azure DevOps a try was that someone told me you can have triggers on subdirectories, like:
```yaml
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - client
```
Tested, and it works.
However, what I'm wondering is whether it is possible to have multiple independent triggers, or whether this requires either a polyrepo or multiple .yml files. The reason being: if there are only changes in the client service, it should trigger only that service's tests, build, and deployment, without triggering the api service's tests, build, and deployment.
For example:
```yaml
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - client

stages:
  # ...
  # Run tests
  # If tests pass, build and push to ACR
  # Deploy to AKS
  # ...
```

```yaml
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - api

stages:
  # ...
  # Run tests
  # If tests pass, build and push to ACR
  # Deploy to AKS
  # ...
```
That way, changes in one service don't cause the entire application to be rebuilt, just what changed. However, does this require multiple .yml files (I'm not even sure whether anything other than azure-pipelines.yml is recognized), does it necessitate a polyrepo, or is this doable in a single azure-pipelines.yml that I'm just not seeing?
If I understand your request correctly, you can achieve this in a single azure-pipelines.yml. Please check the example YAML below.
```yaml
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - client/*
      - api/*

jobs:
- job: getchangepath
  pool:
    vmImage: 'windows-latest'
  steps:
  - powershell: |
      $url = "$(System.CollectionUri)/$(System.TeamProject)/_apis/git/repositories/$(Build.Repository.ID)/commits/$(Build.SourceVersion)/changes?api-version=5.1"
      $result = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $(System.AccessToken)" } -Method GET
      # Keep only the changed folder paths (gitObjectType "tree") from the commit's changes
      $changedFolders = $result.changes | Where-Object { $_.item.gitObjectType -match "tree" } | ForEach-Object { $_.item.path }
      foreach ($path in $changedFolders) {
        if ($path -match '/client') {
          echo "##vso[task.setvariable variable=Client;isOutput=true]$True"
          break
        }
      }
      foreach ($path in $changedFolders) {
        if ($path -match '/api') {
          echo "##vso[task.setvariable variable=Api;isOutput=true]$True"
          break
        }
      }
    name: MyVariable

- job: client
  pool:
    vmImage: 'windows-latest'
  dependsOn: getchangepath
  condition: eq(dependencies.getchangepath.outputs['MyVariable.Client'], 'True')
  steps:
  - powershell: echo 'client job start'

- job: api
  pool:
    vmImage: 'windows-latest'
  dependsOn: getchangepath
  condition: eq(dependencies.getchangepath.outputs['MyVariable.Api'], 'True')
  steps:
  - powershell: echo 'api job start'
```
In the above YAML there are three jobs. In the first job, getchangepath, I call the Git "get changes" REST API in a PowerShell task to fetch the changed paths that triggered the build, and I output a variable if those paths contain /client or /api.
The client and api jobs depend on getchangepath and only run if the condition on the corresponding output variable of getchangepath is met.
Suppose I change a file in the client folder and commit the change to the Azure repo. After getchangepath finishes, MyVariable.Client will be set to True. The client job will then pass its condition and start, while the api job will fail its condition and be skipped.
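To show the output-variable plumbing in isolation, here is a minimal sketch (job, step, and variable names here are illustrative, not from the pipeline above):

```yaml
jobs:
- job: detect
  steps:
    # The step's `name` (SetFlag) becomes the namespace for its output variables.
  - powershell: echo "##vso[task.setvariable variable=Changed;isOutput=true]True"
    name: SetFlag

- job: consumer
  dependsOn: detect
  # Reference pattern: dependencies.<jobName>.outputs['<stepName>.<variableName>']
  condition: eq(dependencies.detect.outputs['SetFlag.Changed'], 'True')
  steps:
  - powershell: echo 'only runs when the flag was set'
```

Note that the step setting the variable must have an explicit `name`; without it, the `outputs[...]` lookup has no namespace to reference.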
I recently faced this problem. You don't need the hard-coded DevOps REST API calls and PowerShell from the solution above.
Here is a simpler solution using out-of-the-box YAML and the workingDirectory property, per the official Azure DevOps documentation.
Set up a project structure like this, with each service having its own YAML file:
```
.
├── README.md
├── azure-pipelines.yml
├── service-a
│   ├── azure-pipelines-a.yml
│   └── …
└── service-b
    ├── azure-pipelines-b.yml
    └── …
```
You might not need a root pipeline, but if you do, you will want to ignore the sub-projects:
```yaml
# Excerpt from /azure-pipelines.yml
trigger:
  paths:
    exclude: # Exclude!
      - 'service-a/*'
      - 'service-b/*'
```
And in the sub-projects, you want them to pay attention to themselves:
```yaml
# Excerpt from /service-a/azure-pipelines-a.yml
trigger:
  paths:
    include: # Include!
      - 'service-a/*' # or 'service-b/*'
```
Your sub-project pipelines still run with the repository root as the working directory. You can change this with the workingDirectory key, for example (using a variable to avoid repetition):
```yaml
variables:
  - name: working-dir
    value: 'service-b/'

steps:
  - script: npm install
    workingDirectory: $(working-dir)
  - script: npm run task
    workingDirectory: $(working-dir)
```
If your projects share steps, you should instead use pipeline templates (possibly kept in another repository), per the official docs.
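As a hedged sketch of that approach (file names and the workingDir parameter are illustrative, not prescribed by the docs): a shared template file holds the common steps, parameterized by working directory.

```yaml
# templates/build-steps.yml — shared steps, parameterized per service
parameters:
  - name: workingDir
    type: string

steps:
  - script: npm install
    workingDirectory: ${{ parameters.workingDir }}
  - script: npm run task
    workingDirectory: ${{ parameters.workingDir }}
```

Each service pipeline then includes the template and passes in its own directory:

```yaml
# Excerpt from /service-a/azure-pipelines-a.yml
steps:
  - template: ../templates/build-steps.yml
    parameters:
      workingDir: 'service-a/'
```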