I use Jenkins a lot, and many different jobs run on the cluster. Every now and then a job fails with an obscure error, and I've traced this to the fact that job A is using an artifact X while job B has just downloaded or built a fresh version of it (with Maven).
Job A then fails with java.lang.NoClassDefFoundError, or with some other error about the ZIP file being corrupt, or with a segfault from the ZIP implementation that Java uses.
The obvious solution would be to use a private local Maven repository for each job, but that is quite expensive in disk space and bandwidth: each job's Maven would download the whole internet again.
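For illustration, the per-job setup I mean is just a minimal sketch like this, pointing Maven's local repository into the job's own workspace (the exact path is arbitrary):

    # Each job keeps its artifacts in its own workspace-local repository,
    # fully isolating it from other jobs at the cost of re-downloading everything.
    mvn -Dmaven.repo.local="$WORKSPACE/.m2/repository" clean install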
In some cases, eagerly loading the class files from the JAR that tends to get corrupted helps us limit the damage, but I don't want to do that for everything.
Any other solutions? Some sort of improved locking for the repository?
Yes, it looks like Maven doesn't handle concurrent access to a single local repository well (there is an open issue about it, opened "recently", just seven years ago). Creating a repository for each job could be quite massive. However, Jenkins exposes the variable $EXECUTOR_NUMBER, so you could use it to create a Maven local repository per Jenkins executor.
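A minimal sketch of that idea in a shell build step (the directory name under the home directory is just an example):

    # One local repository per Jenkins executor. Jobs running concurrently
    # on the same node get different executor numbers, so they never share
    # a repository, while sequential builds on the same executor reuse the cache.
    mvn -Dmaven.repo.local="$HOME/.m2/repository-executor-$EXECUTOR_NUMBER" clean verify

This keeps the number of repositories bounded by the executor count rather than the job count, so the disk and bandwidth cost stays much lower than a repository per job.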