How can we configure GitLab to keep only the last 10 CI jobs/builds and delete the rest?
For example, in Jenkins, we can configure a job to keep only the last X builds.
By default, all job traces (logs) are saved to /var/opt/gitlab/gitlab-ci/builds for Omnibus packages and /home/git/gitlab/builds for installations from source. The job logs are organized by year and month (for example, 2017_03), and then by project ID.
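Since job traces are plain files on disk, one blunt way to cap retention is a periodic `find` over that directory (for example from cron). A minimal sketch, assuming the Omnibus layout described above; it is demonstrated on a throwaway temporary directory so it is safe to run as-is, and the 30-day cutoff is an arbitrary example:

```shell
#!/bin/bash
set -e

# Stand-in for /var/opt/gitlab/gitlab-ci/builds -- point this at the real
# directory (and drop the demo setup below) for actual use.
BUILDS_DIR=$(mktemp -d)

# Demo setup: one old trace and one recent trace under year_month/project_id.
mkdir -p "$BUILDS_DIR/2017_03/42"
touch "$BUILDS_DIR/2017_03/42/old.log"
touch -d "45 days ago" "$BUILDS_DIR/2017_03/42/old.log"   # backdate (GNU touch)
touch "$BUILDS_DIR/2017_03/42/new.log"

# Delete trace files not modified in the last 30 days.
find "$BUILDS_DIR" -type f -name '*.log' -mtime +30 -delete
```

This trims disk usage by age rather than by count; the API-based approaches below are needed to actually remove pipelines from GitLab itself.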
GitLab CI/CD is the part of GitLab that you use for all of the continuous methods (Continuous Integration, Delivery, and Deployment). With GitLab CI/CD, you can test, build, and publish your software with no third-party application or integration needed.
A GitLab Runner is a build agent that runs jobs, possibly across multiple machines, and sends the results back to GitLab; it can live on a separate server, your local machine, or under a separate user. After installing it, you can register the runner as shared or specific.
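For reference, registering a runner is a single CLI call. This is a hedged sketch: the URL, token, and description are placeholders you must replace with your own values.

```shell
# Register a runner against your GitLab instance (all values are placeholders).
# --non-interactive skips the prompts; pick the executor that fits your setup.
gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.example.com/" \
  --registration-token "REGISTRATION_TOKEN" \
  --executor "shell" \
  --description "my-runner"
```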
As of GitLab 12.6, deleting a pipeline is an option in the GUI for owners.
As of GitLab 11.6, deleting a pipeline is possible via the API, for maintainers only.
You need:
- the id of the project
- the pipeline_id of the pipeline you wish to remove

Example using curl, from the docs, for project id 1 and pipeline_id 46:
curl --header "PRIVATE-TOKEN: <your_access_token>" --request "DELETE" "https://gitlab.example.com/api/v4/projects/1/pipelines/46"
The full details are in the GitLab Pipelines API documentation.
For the lazy, a mass-deletion script that deletes up to X pipelines, starting from the oldest. Note: it requires jq.
#!/bin/bash
set -e

TOKEN=""
PROJECT=""
# How many to delete, starting from the oldest.
PER_PAGE=100

for PIPELINE in $(curl --header "PRIVATE-TOKEN: $TOKEN" \
    "https://gitlab.com/api/v4/projects/$PROJECT/pipelines?per_page=$PER_PAGE&sort=asc" \
    | jq '.[].id'); do
  echo "Deleting pipeline $PIPELINE"
  curl --header "PRIVATE-TOKEN: $TOKEN" --request "DELETE" \
    "https://gitlab.com/api/v4/projects/$PROJECT/pipelines/$PIPELINE"
done
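To match the original "keep only the last 10" requirement, you would instead sort the pipeline list newest-first and delete everything after the first 10. The selection logic is sketched below on a hard-coded JSON sample (standing in for the real `GET /projects/:id/pipelines` response) so it runs offline; in practice you would feed it the curl listing and loop over the result issuing the DELETE request shown above:

```shell
#!/bin/bash
set -e
KEEP=10

# Sample of the pipelines listing; only the ids matter for this sketch.
SAMPLE='[{"id":12},{"id":11},{"id":10},{"id":9},{"id":8},{"id":7},{"id":6},{"id":5},{"id":4},{"id":3},{"id":2},{"id":1}]'

# Sort newest first, skip the first $KEEP, and print the rest:
# these are the pipeline ids to DELETE.
TO_DELETE=$(echo "$SAMPLE" | jq --argjson keep "$KEEP" 'sort_by(-.id) | .[$keep:] | .[].id')
echo "$TO_DELETE"
```

With 12 sample pipelines and KEEP=10, the two oldest ids (2 and 1) are selected for deletion.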