I'm using a heap queue to implement an algorithm. When I add new nodes to my queue, they're sorted by a heuristic function, e.g. heappush(queue, (score(node), node)), which is fantastic, apart from the fact that when I come to pop the next node out of the queue, I want the most recently added of the best-scoring nodes, as opposed to the first one added, which is what heappop returns. How can I get the most recently added nodes out of the queue without breaking it?
I guess I could just begin iterating at the first element and, while the next element has the same score, continue; when I find the final element with that score, I select it and remove it from the list. That obviously isn't very efficient, though, and it breaks the time complexity a priority queue is supposed to give.
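Something like this rough sketch is what I have in mind (pop_newest is just a made-up name, and it assumes the heap's underlying list is kept ordered by score, which, as my edits below show, it isn't):

import heapq

def pop_newest(queue):
    # Walk forward from the front while entries share the best score,
    # then remove the last of them. The removal and the re-heapify are
    # both O(n), which throws away the O(log n) behaviour of the heap.
    best_score = queue[0][0]
    i = 0
    while i + 1 < len(queue) and queue[i + 1][0] == best_score:
        i += 1
    newest = queue.pop(i)
    heapq.heapify(queue)
    return newest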
I'm stuck on this and I can't figure out a way to do it.
Thanks.
EDIT: Using a counter, as suggested, doesn't seem to work (perhaps I've misunderstood):
>>> from heapq import heappush, heappop
>>> queue = []
>>> heappush(queue, (2, 0, 'a'))
>>> heappush(queue, (3, -1, 'b'))
>>> queue
[(2, 0, 'a'), (3, -1, 'b')]
>>> heappush(queue, (2, -2, 'c'))
>>> queue
[(2, -2, 'c'), (3, -1, 'b'), (2, 0, 'a')]
Now the queue appears to be ordered incorrectly, and 'b', which is a worse option than 'a', is placed before it.
EDIT 2:
What the hell?
>>> heappop(queue)
(2, -2, 'c')
>>> queue
[(2, 0, 'a'), (3, -1, 'b')]
>>>
One very simple way is to add a middle term containing a timestamp or a counter value, negated so that the most recently added entry wins ties. For example:
>>> import heapq, itertools
>>> heap = []
>>> counter = itertools.count()
>>> for i in range(10):
...     heapq.heappush(heap, (i, -next(counter), str(i) + ' oldest'))
...     heapq.heappush(heap, (i, -next(counter), str(i) + ' newest'))
...
>>> heapq.heappop(heap)
(0, -1, '0 newest')
>>> heapq.heappop(heap)
(0, 0, '0 oldest')
>>> heapq.heappop(heap)
(1, -3, '1 newest')
>>> heapq.heappop(heap)
(1, -2, '1 oldest')
As agf mentions, this is the approach used by the heapq docs, except that the counter is negated here so the newest entry among equal scores is popped first rather than the oldest. (The implementation there also includes a nice task-removal and priority-change routine.)
Regarding your edits: the list backing the heap is not supposed to look sorted, so nothing is broken. The heap invariant only guarantees that queue[0] is the smallest entry; heappop removes it and then restores the invariant. That's why (2, -2, 'c') came out first in your second edit even though the printed list looked out of order, and why 'b' was never going to be popped before 'a'.
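If you'd rather keep the counter bookkeeping out of the rest of your algorithm, you could wrap it up in a small class. A minimal sketch, assuming you only need push and pop (LifoPriorityQueue is just an illustrative name, not anything from the standard library):

import heapq
import itertools

class LifoPriorityQueue:
    """Priority queue that breaks ties on score newest-first."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()

    def push(self, score, item):
        # Negating the counter makes the most recently pushed entry the
        # smallest among equal scores, so heappop returns it first.
        heapq.heappush(self._heap, (score, -next(self._counter), item))

    def pop(self):
        score, _, item = heapq.heappop(self._heap)
        return score, item

    def __len__(self):
        return len(self._heap)

With the example from your first edit it then behaves the way you wanted:

>>> q = LifoPriorityQueue()
>>> q.push(2, 'a')
>>> q.push(3, 'b')
>>> q.push(2, 'c')
>>> q.pop()
(2, 'c')
>>> q.pop()
(2, 'a')
>>> q.pop()
(3, 'b')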