How to share file(s) between YAML Stages in Azure DevOps

I am trying to deploy my Node.js code to an Azure Function App using Azure DevOps. I have created the following Azure DevOps pipeline in YAML.

The problem I am facing is that the deploy step fails because it cannot find the package. Looking at the logs, I believe the folders are cleaned up between jobs/stages. I have tried other predefined variables, such as Build.ArtifactStagingDirectory, but none works.

trigger:
  - master

variables:
  azureServiceConnection: 'mySvcCon'
  azureFuncApp: myFApp

stages:
  - stage: Build_1
    displayName: 'Build Stage'
    jobs:
      - job: build
        displayName: 'Build Node.js app'
        pool:
          vmImage: 'Ubuntu-16.04'

        steps:
          - task: NodeTool@0
            displayName: 'Install Node.js'
            inputs:
              versionSpec: '8.x'

          - script: |
              npm install
            displayName: 'npm install and build'

          - task: CopyFiles@2
            displayName: 'Copy necessary files'
            inputs:
              SourceFolder: '$(System.DefaultWorkingDirectory)'
              Contents: |
                **/*
                !.vscode/**/*
              TargetFolder: '$(System.DefaultWorkingDirectory)/copied'

          - task: PublishBuildArtifacts@1
            displayName: 'Publish artifact'
            enabled: true
            inputs:
              PathtoPublish: '$(Build.ArtifactStagingDirectory)/copied'
              publishLocation: filePath
              targetPath: '$(System.DefaultWorkingDirectory)/publish'

  - stage: Deploy_2
    displayName: 'Deploy Stage'
    jobs:
      - job: Deploy
        displayName: 'Deploy to Function App'
        pool:
          vmImage: 'Ubuntu-16.04'

        steps:
          - task: AzureRMWebAppDeployment@4
            displayName: 'AzureRM Function App deploy'
            inputs:
              ConnectionType: 'AzureRM'
              ConnectedServiceName: $(azureServiceConnection)
              WebAppKind: 'Function App'
              WebAppName: $(azureFuncApp)
              Package: '$(System.DefaultWorkingDirectory)/publish'

How do I share my artifact between the stages? The same pipeline works if I put all steps in the same job. But I want to separate them out.

asked Apr 15 '19 by TechiRik


People also ask

How do you pass variables from one stage to another in Azure DevOps?

To pass variables between tasks in the same job, set the value with the command echo "##vso[task.setvariable variable=FOO]some value". In subsequent tasks, you can then use the $(FOO) syntax to have Azure Pipelines replace the variable with that value.
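
A minimal sketch of that pattern within a single job (the variable name FOO and the step display names are illustrative, not from the original):

steps:
  - script: echo "##vso[task.setvariable variable=FOO]some value"
    displayName: 'Set the FOO variable for later tasks'
  - script: echo "FOO is now $(FOO)"
    displayName: 'Read FOO in a subsequent task'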

What are stages in a YAML file?

The concept of stages varies depending on whether you use YAML pipelines or classic release pipelines. You can organize pipeline jobs into stages. Stages are the major divisions in a pipeline: "build this app", "run these tests", and "deploy to pre-production" are good examples of stages.
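
A minimal stage skeleton in YAML pipeline syntax (the stage, job, and script contents below are illustrative):

stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - script: echo Building the app
- stage: Test
  dependsOn: Build
  jobs:
  - job: TestJob
    steps:
    - script: echo Running the tests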


2 Answers

In general, creating artifacts is typically done by a build pipeline, while deploying artifacts is done in a release pipeline. There is definitely the opportunity to perform both actions in a single build pipeline, depending on your usage. Combining them especially makes sense when you are just getting started with Azure Pipelines, as the ecosystem can be overwhelming with the number of capabilities available. There is publicized work on merging the release capabilities into the build capabilities to simplify onboarding.

Separating the pipelines does give you the benefit of retrying a deployment if it failed the first time - whether that matters really depends on how quick your build is. Deploying the same bits across environments is also easier if you want to manually trigger environment or ringed release propagation. The list of reasons for separating build and deploy grows quickly once you dig into some of the power-user features of release stages.

For your approach to work, you could leverage the dependsOn YAML element so that subsequent jobs depend on the output of the earlier ones.

Build Pipeline - Dependency Chaining

jobs:
- job: InitialA
  steps:
  - script: echo hello from initial A
- job: InitialB
  steps:
  - script: echo hello from initial B
- job: Subsequent
  dependsOn:
  - InitialA
  - InitialB
  steps:
  - script: echo hello from subsequent
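
The same dependsOn element also works at the stage level, which matches the two-stage layout in the question (the stage and job names below mirror the question; the script steps are placeholders):

stages:
- stage: Build_1
  jobs:
  - job: build
    steps:
    - script: echo build and publish the artifact here
- stage: Deploy_2
  dependsOn: Build_1
  jobs:
  - job: Deploy
    steps:
    - script: echo deploy the artifact here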

Update 11/15/19

Azure DevOps recently released the download task to consume files across CI/CD boundaries. Pipeline artifacts can now also be used to share files across stages.
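
A minimal sketch of that publish/download approach adapted to the question's layout (the artifact name drop and the publish path are illustrative assumptions):

stages:
- stage: Build_1
  jobs:
  - job: build
    steps:
    # build and copy steps go here
    - publish: '$(System.DefaultWorkingDirectory)/copied'
      artifact: drop
- stage: Deploy_2
  dependsOn: Build_1
  jobs:
  - job: Deploy
    steps:
    - download: current
      artifact: drop
    # the files are now available under $(Pipeline.Workspace)/drop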

answered Oct 25 '22 by SliverNinja - MSFT


As mentioned in the other answer and in some comments, it is now possible to download previously published artifacts.

In the code below, I publish the scripts folder, which sits at the root of my solution, as an artifact named dropScripts, so that the scripts it contains can be used in later stages of the pipeline. In another stage, I download the dropScripts artifact and then run a PowerShell script (script20.ps1) that was contained in the scripts folder.

stages:
- stage: 'Build'
  jobs: 
  - job: 'Build'
    pool:
      vmImage: 'ubuntu-16.04'
    steps:

    (...)

    - task: CopyFiles@2
      displayName: 'Copy powershell scripts for later use in the pipeline'
      inputs:
        contents: 'scripts/**'
        targetFolder: '$(Build.ArtifactStagingDirectory)'
    - publish: '$(Build.ArtifactStagingDirectory)/scripts'
      displayName: 'Publish powershell scripts for later use'
      artifact: dropScripts

- stage: DoSomethingWithTheScriptsStage
  dependsOn: Build
  jobs: 
  - job: DoSomethingWithTheScriptsJob
    pool: 
      vmImage: 'windows-2019'
    steps:
      - download: current
        artifact: dropScripts
      - task: PowerShell@2
        inputs:
          filePath: '$(Pipeline.Workspace)\dropScripts\script20.ps1'

answered Oct 25 '22 by ccoutinho