Python tox: creating an RPM virtualenv as part of a CI pipeline, unsure where it fits in the workflow

I'm investigating how Python applications can use a CI pipeline too, but I'm not sure how to set up the standard workflow.

Jenkins does the initial repository clone and then initiates tox. Basically this is the stage where Maven and/or MSBuild would fetch dependency packages and build... which tox does via pip, so all good here.

But now for the confusing part: the last part of the pipeline is creating and uploading packages. Devs would likely upload the created packages to a local pip repository, BUT then also possibly create a deployment package. In this case it would need to be an RPM containing a virtualenv of the application. I have made one manually using rpmvenv, but regardless of how it's made, how would such a step be added to a tox config? In the case of rpmvenv, it creates its own virtualenv; it's a self-contained command, so to speak.
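
For reference, this is roughly what I picture in tox.ini, assuming rpmvenv can simply be invoked as a command from its own tox environment; the environment name and the rpm.json config file are placeholders for things I would still have to write:

    [testenv:rpm]
    # don't install the project into this env; rpmvenv builds its own virtualenv
    skip_install = true
    deps = rpmvenv
    commands = rpmvenv {toxinidir}/rpm.json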

asked Apr 09 '16 by J. M. Becker

People also ask

What does tox do in Python?

tox is a command-line driven automated testing tool for Python, based on the use of virtualenv. It can be used both for manually invoked testing from the desktop and for continuous testing within continuous integration frameworks such as Jenkins or Travis CI.
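
For example, a minimal local run might look like this; the environment name in the last command depends entirely on what your tox.ini lists:

    pip install tox
    tox             # run every environment in envlist
    tox -e py27     # run a single environment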

How do I run a tox INI file?

    # content of: tox.ini, put in same dir as setup.py
    [tox]
    envlist = py26,py27
    [testenv]
    deps = pytest       # install pytest in the venvs
    commands = pytest   # or 'nosetests' or ...

You can also try generating a tox.ini file automatically by running tox-quickstart and then answering a few simple questions.

What is a tox file?

Tox is a tool that creates virtual environments and installs the configured dependencies into those environments for the purpose of testing a Python package (i.e. something that will be shared via PyPI, and so it only works with code that defines a setup.py).


1 Answer

I like going with the Unix philosophy for this problem: have a tool that does one thing incredibly well, then compose tools together. Tox is purpose-built to run your tests in a bunch of different Python environments, so using it to also build a deb/rpm/etc. feels like a misuse of the tool. It's probably easier to use tox just to run all your tests, then, depending on the results, have another step in your pipeline deal with building a package for what was just tested.
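
Concretely, that stage of the pipeline can stay as simple as something like the following, where build_package.sh is just a placeholder for whatever rpmvenv/fpm invocation you settle on:

    # run the full test matrix; only build a package if everything passed
    tox && ./build_package.sh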

Jenkins 2.x, which is fairly recent at the time of this writing, seems to be much better about building pipelines. Buildbot is going through a decent amount of development and already makes it fairly easy to build a good pipeline for this as well.

What we've done at my work is:

  • Buildbot in AWS which receives push notifications from GitHub on PRs
  • That kicks off a docker container that pulls in the current code and runs Tox (py.test, flake8, as well as protractor and jasmine tests)
  • If the tox step comes back clean, kick off a different docker container to build a deb package
  • Push that deb package up to S3 and let Salt deal with telling those machines to update (a rough sketch of steps two through four follows this list)
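
Very roughly, those middle steps look something like this; the image names, script name and paths are illustrative rather than our exact setup:

    # step 2: run the test suite inside a throwaway container
    docker run --rm -v "$(pwd)":/src -w /src ci-image tox

    # step 3: if that exits 0, build the deb in a separate container
    docker run --rm -v "$(pwd)":/src -w /src build-image ./package.sh

    # step 4: push the resulting .deb up to S3 (e.g. via aptly, shown further below)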

That deb package is also just available as a build artifact, similar to what Jenkins 1.x would do. Once we're ready to go to staging, we just take that package and promote it to the staging debian repo manually. Ditto for rolling it to prod.

Tools I've found useful for all this:

  • Buildbot, because it's in Python and thus easier for us to work on, but Jenkins would work just as well. Regardless, this is the controller for the entire pipeline
  • Docker, because each build should be completely isolated from every other build
  • Tox, the glorious test runner that handles all those details
  • fpm builds the package. RPM, DEB, tar.gz, whatever. Very configurable and easy to script.
  • Aptly makes it easy to manage Debian repositories and in particular push them up to S3. (Rough example commands for fpm and Aptly follow this list.)
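
For completeness, hedged one-liners for those last two; every name, version and endpoint below is a placeholder, and the Aptly S3 endpoint is assumed to already exist in its config file:

    # fpm: build a deb (or swap -t deb for -t rpm) from a directory of files
    fpm -s dir -t deb -n myapp -v 1.0.0 --prefix /opt/myapp .

    # aptly: add the package to a local repo and publish that repo to S3
    aptly repo add myapp-repo myapp_1.0.0_amd64.deb
    aptly publish repo -distribution="stable" myapp-repo s3:my-endpoint:
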
answered Sep 30 '22 by Paul