I'm developing a Pylons app which is based on an existing database, so I'm using reflection. I have an SQL file with the schema that I used to create my test database. That's why I can't simply use drop_all and create_all.
I would like to write some unit tests, and I've run into the problem of clearing the database content after each test. I just want to erase all the data but leave the tables intact. Is this possible?
The application uses Postgres, and the tests have to run against Postgres as well.
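(For what it's worth, since this is Postgres-specific: Postgres can wipe data while keeping the schema in a single TRUNCATE statement; CASCADE takes care of foreign-key dependencies and RESTART IDENTITY resets sequences. A minimal sketch, not from the thread, assuming a SQLAlchemy engine named engine and a schema loaded via reflection:)

    from sqlalchemy import MetaData, text

    meta = MetaData()
    meta.reflect(bind=engine)  # load the existing tables via reflection

    # TRUNCATE all reflected tables in one statement; CASCADE handles
    # foreign-key dependencies, RESTART IDENTITY resets sequences.
    names = ", ".join('"%s"' % t.name for t in meta.sorted_tables)
    with engine.begin() as con:
        con.execute(text("TRUNCATE %s RESTART IDENTITY CASCADE" % names))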
I asked about the same thing on the SQLAlchemy Google group, and I got a recipe that appears to work well (all my tables are emptied). See the thread for reference.
My code (excerpt) looks like this:
    import contextlib
    from sqlalchemy import MetaData

    meta = MetaData()
    # Reflect the existing schema; without this, sorted_tables is empty
    # and the loop below does nothing.
    meta.reflect(bind=engine)

    with contextlib.closing(engine.connect()) as con:
        trans = con.begin()
        # Children are deleted before parents (see the edit below).
        for table in reversed(meta.sorted_tables):
            con.execute(table.delete())
        trans.commit()
Edit: I modified the code to delete tables in reverse order; supposedly this should ensure that children are deleted before parents.
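Since the original question is about clearing data after each test, the recipe can be hung off a test fixture. A minimal sketch using pytest (the thread doesn't name a test runner, so the fixture and the in-scope engine are assumptions):

    import contextlib
    import pytest
    from sqlalchemy import MetaData

    @pytest.fixture(autouse=True)
    def clean_tables():
        yield  # run the test first, then clean up
        meta = MetaData()
        meta.reflect(bind=engine)  # `engine` is assumed to be in scope
        with contextlib.closing(engine.connect()) as con:
            trans = con.begin()
            for table in reversed(meta.sorted_tables):
                con.execute(table.delete())
            trans.commit()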