I tried to do sorting in a SQLAlchemy query; the parameters come from 'query_sort', which contains a list of sort parameters (field and direction).
Here is the code:
def select_all(self, query_paging, query_sort):
    """ method to select all the transport types """
    try:
        select_all_query = \
            self._session.query(TransportType)
        for s in query_sort:
            select_all_query = \
                select_all_query.order_by(s.dir(getattr(TransportType, s.field)))\
                .limit(query_paging.page_size)\
                .offset(query_paging.skip)\
                .all()
        return select_all_query
    except NoResultFound:
        return None
Then, in pytest, I tried to test this method with the following code:
s1 = sort
s1.field = "type"
s1.dir = asc
s2 = sort
s2.field = "transport_type_id"
s2.dir = asc
query_sort = [s1, s2]
query_paging.skip = 1
query_paging.page_size = 10
transport_types = repo.select_all(query_paging, query_sort)
assert len(transport_types) == 1
When I ran the test, I got this error:
E AttributeError: 'list' object has no attribute 'order_by'
It worked fine when I used only one sort parameter (s1), but as soon as I tested with more than one it produced this error.
You called .all() on your query, which executes it and returns all the results as a list:
select_all_query = \
    select_all_query.order_by(s.dir(getattr(TransportType, s.field)))\
    .limit(query_paging.page_size)\
    .offset(query_paging.skip)\
    .all()
so on the next iteration of the loop, select_all_query is a plain Python list, not a Query.
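You can reproduce the failure in isolation: once a query has been executed with .all(), the result is an ordinary list and no longer supports the Query API (a minimal sketch, assuming a session and the TransportType model from the question):

results = session.query(TransportType).all()  # executes the query, returns a list
print(type(results))                          # <class 'list'>
results.order_by(TransportType.type)          # AttributeError: 'list' object has no attribute 'order_by'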
If you need to apply several orderings, do just that in the loop, and call limit(), offset() and .all() only once, after the loop:
select_all_query = self._session.query(TransportType)
for s in query_sort:
    select_all_query = select_all_query.order_by(
        s.dir(getattr(TransportType, s.field)))
select_all_query = (
    select_all_query.limit(query_paging.page_size)
    .offset(query_paging.skip)
    .all())
return select_all_query
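Alternatively, since Query.order_by() accepts multiple criteria, the loop can be folded into a single call; a sketch assuming the same query_sort and query_paging objects as above:

select_all_query = (
    self._session.query(TransportType)
    .order_by(*[s.dir(getattr(TransportType, s.field)) for s in query_sort])
    .limit(query_paging.page_size)
    .offset(query_paging.skip)
    .all()
)
return select_all_query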
My solution was to change all().order_by() to filter().order_by().
To order posts by the most recent date posted, I did:
posts = Post.query.filter().order_by('date_posted desc')
Hope that helps.
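Note that passing a raw SQL string such as 'date_posted desc' to order_by() is deprecated in recent SQLAlchemy versions; a sketch of the same ordering using the column attribute instead (assuming a Post model with a date_posted column):

from sqlalchemy import desc

posts = Post.query.order_by(Post.date_posted.desc()).all()
# or equivalently
posts = Post.query.order_by(desc(Post.date_posted)).all()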