I have a model with a field is_deleted, and I want all forms of query for this model to always filter by is_deleted=False in addition to whatever filtering arguments are passed to .filter and .filter_by.
In Django I would normally override the manager and add my own filtering, but I need help doing the equivalent in SQLAlchemy.
UPDATE:
I ended up doing the following:
    from sqlalchemy.orm import Query, scoped_session, sessionmaker

    class CustomQuery(Query):
        def __new__(cls, *args, **kwargs):
            # args[0] is the tuple of entities passed to session.query();
            # only add the soft-delete filter when the first entity has
            # an is_deleted attribute.
            if args and hasattr(args[0][0], "is_deleted"):
                return Query(*args, **kwargs).filter_by(is_deleted=False)
            else:
                return object.__new__(cls)

    session = scoped_session(sessionmaker(query_cls=CustomQuery))
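For illustration, assuming a hypothetical User model mapped with a boolean is_deleted column, every query issued through this session picks up the extra criterion automatically:

    # `User` and its `name` column are hypothetical, for illustration only.
    users = session.query(User).filter_by(name="alice").all()
    # The emitted WHERE clause contains an is_deleted = false condition
    # (rendered per dialect) in addition to the name = 'alice' filter.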
It works, but if I add more fields later on I imagine I'll have to add more conditions; there must be a way to do this at the model level.
This is a very old question so I'm sure the OP solved their issue, but as it remains unanswered (in 2021), here's how we've approached applying a custom query class to all models:
    class CustomQuery(Query):
        ...

    class BaseModel(Model):
        __abstract__ = True
        query_class = CustomQuery
        ...
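The body of CustomQuery is left elided in the answer. Below is a minimal sketch of one way it could be written, assuming every model that opts in has a boolean is_deleted column; it uses the same __new__ trick as the question's update, and the blog post linked at the end of this answer develops a fuller version (including a way to still query deleted rows). Depending on your Flask-SQLAlchemy version you may prefer to subclass its own base query class rather than sqlalchemy.orm.Query.

    from sqlalchemy.orm import Query

    class CustomQuery(Query):
        def __new__(cls, *args, **kwargs):
            obj = super().__new__(cls)
            if args:
                # Initialise the underlying Query here so the soft-delete
                # filter can be appended before the object is handed back.
                super(CustomQuery, obj).__init__(*args, **kwargs)
                return obj.filter_by(is_deleted=False)
            return obj

        def __init__(self, *args, **kwargs):
            # __init__ runs again on the object returned from __new__ and
            # would wipe out the filter, so it is deliberately a no-op.
            pass

Because the filter is attached in __new__, it is applied whenever a query object is built from this class, whether through a session configured with query_cls or through a model's query_class attribute.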
Then any model inheriting from BaseModel will obviously inherit this behaviour:
    class MyModel(BaseModel):
        __tablename__ = 'my_model'
        ...
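For instance, assuming CustomQuery filters on is_deleted as sketched above, queries built from the model now exclude soft-deleted rows without any extra arguments:

    # Both of these only return rows where is_deleted is False;
    # `name` is a hypothetical column used purely for illustration.
    active = MyModel.query.all()
    match = MyModel.query.filter_by(name="example").first()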
Note that in our case not all of the tables follow the soft delete pattern (we don't care about the history of every single table). For those, you could implement a separate base model that uses the default query class:
    class ImmutableBaseModel(Model):
        __abstract__ = True
        query_class = CustomQuery
        ...

    class MutableBaseModel(Model):
        __abstract__ = True
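A table whose history you don't care about would then extend MutableBaseModel and keep the stock query behaviour. A small sketch with a hypothetical model:

    from sqlalchemy import Column, Integer

    class AuditLog(MutableBaseModel):      # hypothetical example model
        __tablename__ = 'audit_log'
        id = Column(Integer, primary_key=True)
        # AuditLog.query falls back to the default query class, so no
        # is_deleted filter is ever added to its queries.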
If you find yourself here and you haven't read it yet, check out Miguel Grinberg's excellent blog post on implementing the soft delete pattern and the accompanying repo.