I am building a Pyramid web application which is built on top of SQLAlchemy and relies solely on PostgreSQL as its database backend.
What would be a good way to structure the unit tests so that:
Database transactions are rolled back in tearDown(), or another clean-up hook of the test suite, to speed up tests
Other tricks to speed up tests can be used, e.g. if SQLAlchemy and PostgreSQL have anything corresponding to SQLite's :memory: database
It is possible to choose a custom test runner à la py.test if specific features outside the standard library unittest framework make it easier to write test cases.
Since this question is a bit similar to your other question, I'll copy the relevant part of my answer here:
All test case classes should be subclassed from a base class which defines a common tearDown method (here session stands for your application's SQLAlchemy scoped session):
import unittest

from pyramid import testing

# "session" is the application's SQLAlchemy scoped session (adjust the import)
from myapp.models import session

class BaseTest(unittest.TestCase):

    def setUp(self):
        # This makes things nicer if the previous test fails
        # - without this all subsequent tests fail
        self.tearDown()
        self.config = testing.setUp()

    def tearDown(self):
        testing.tearDown()
        session.expunge_all()
        session.rollback()
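A concrete test case can then simply subclass BaseTest and use the session normally, relying on the rollback in tearDown to leave the database clean. A minimal sketch, where myapp.models, its session and the User model are hypothetical stand-ins for your own application objects:
from myapp.models import User, session  # hypothetical application objects

class UserModelTest(BaseTest):

    def test_add_user(self):
        # Everything added here stays inside the current transaction;
        # BaseTest.tearDown() rolls it back, so the table is empty again
        # for the next test.
        user = User(name="alice")
        session.add(user)
        session.flush()  # assigns a primary key without committing
        self.assertIsNotNone(user.id)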
For tests, you can initialize the PostgreSQL data directory on a RAM disk (you can create a directory under /dev/shm/, which is mounted as tmpfs in modern Linux distributions). Then you can run Postgres on a non-standard port:
PGTEMP=`mktemp -d /dev/shm/pgtemp.XXXXXX`
initdb -D "$PGTEMP"
postgres -D "$PGTEMP" -k "$PGTEMP" -p 54321
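The test suite then points SQLAlchemy at that throwaway server instead of your regular database. A rough sketch matching the port above (it assumes the default postgres database created by initdb and the default trust authentication for local connections; run createdb -p 54321 first if you want a dedicated test database):
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Connect over TCP to the temporary cluster started on port 54321 above.
engine = create_engine("postgresql://localhost:54321/postgres")
Session = sessionmaker(bind=engine)
When the run finishes, stop the postgres process and delete $PGTEMP so nothing is left behind on the RAM disk.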