I'm fiddling with pipelines to try and reduce the overall runtime. One of the things I'd like to do is to execute docker pull ...
at the start, so that later on, when I actually need the image, it's already there. I'd like to fire it off as a background job and have it survive past the end of that task.
I've tried: docker pull imgname &
It does work, but the pipeline complains with this message:
The STDIO streams did not close within 10 seconds of the exit event from process '/bin/bash'. This may indicate a child process inherited the STDIO streams and has not yet exited.
I've also tried stuff like:
docker pull imgname </dev/null &>/dev/null & disown
docker pull imgname 0>&- 1>&- 2>&- 3>&- 4>&- 5>&- 6>&- 7>&- 8>&- 9>&- &
And a few similar tricks. Nothing helps.
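For context, the usual shell-side incantation combines the redirections tried above with nohup and disown. A hedged sketch (using sleep as a stand-in for docker pull imgname, since imgname is a placeholder; as the answer below explains, the agent may still print the STDIO message, but the child keeps running):

```shell
#!/usr/bin/env bash
# Hedged sketch: detach a long-running command from the step's shell.
# 'sleep 30' stands in for 'docker pull imgname'.
# - </dev/null closes stdin; >/dev/null 2>&1 releases the inherited
#   STDIO streams the agent waits on
# - nohup shields the child from SIGHUP when the shell exits
# - disown removes the job from the shell's job table
nohup sleep 30 </dev/null >/dev/null 2>&1 &
disown
echo "step script can exit now; the background job keeps running"
```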
This isn't a big deal, but it would be convenient to know how to make this possible!
Update
The STDIO streams did not close within 10 seconds of the exit event from process '/bin/bash'. This may indicate a child process inherited the STDIO streams and has not yet exited
This is not an error message: it is not written to the standard error stream and does not fail the task.
It is more of an informational message indicating that some process is still running and has not been cleaned up (expected behavior).
After enabling debug mode in the build pipeline, we can see:
##[debug]The task was marked as "done", but the process has not closed after 5 seconds. Treating the task as complete.
The process should still be running in the background even though the task is already marked as completed.
According to your description, this does not seem related to the docker command or the Azure DevOps side.
You just need to run a PowerShell script (which invokes the docker command) in the background.
For example, if you run Start-Job
inside a PowerShell task, the script runs as a background job whose output you read with Receive-Job.
When the task exits, the script stops.
In the PowerShell task I run the following:
Start-Job -FilePath "C:\build\BGGetFromNuGet.ps1" -ArgumentList "C:\build"
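To make the lifetime issue concrete, a hedged PowerShell sketch (the script block is illustrative, not from the original): a job created with Start-Job belongs to the task's PowerShell session, so it is torn down when that session exits.

```powershell
# Hedged sketch: the job lives only as long as the task's pwsh process.
$job = Start-Job -ScriptBlock { Start-Sleep -Seconds 30; "done" }
# Receive-Job -Wait blocks until the job finishes and returns its output;
# omit -Wait to collect whatever output is available so far.
Receive-Job -Job $job -Wait
```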
For more details, take a look at Start-Job.
If you want the script to continue running in the background after the task has finished, you could try using the Start-Process
command to launch it. This makes sure the launched job keeps running when the task is finished, but the job will still be terminated when the build finishes.
Start-Process powershell.exe -ArgumentList '-file C:\build\BGGetFromNuGet.ps1'
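Applied to the original question, a hedged sketch (imgname is the placeholder image name from the question): Start-Process can launch docker directly, detaching the pull from the task's PowerShell process.

```powershell
# Hedged sketch: the detached child is not waited on by the task step,
# so the pull continues while later steps run (until the build ends).
Start-Process -FilePath "docker" -ArgumentList "pull", "imgname"
```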