I am using generators to perform searches in lists like this simple example:
>>> a = [1,2,3,4]
>>> (i for i, v in enumerate(a) if v == 4).next()
3
(To frame the example a bit: the real lists are much longer than the one above, and the entries are more complicated than ints. I do it this way so the entire list isn't traversed each time I search it.)
Now if I instead change the condition to v == 666
, it raises StopIteration
because there is no 666
entry in a
.
How can I make it return None
instead? I could of course wrap it in a try ... except
clause, but is there a more Pythonic way to do it?
You can also end the iteration early by calling .close() on the generator: the generator stops, and the loop exits without any exception being raised.
A generator function uses yield instead of the return keyword, so it produces the value at each yield every time it is asked for the next item. You iterate over the result of calling the function rather than calling it once. (In Python 2 a generator function could not return a value, though a bare return was allowed to end the iteration.)
Generators are memory efficient, since they only hold the one value they are currently yielding. They are lazy: they only produce values when explicitly asked for them. You can also feed the output of one generator into another to build data pipelines.
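A minimal sketch of such a pipeline (the function names here are illustrative, not from the question):

```python
def squares(nums):
    # Lazily yield the square of each input value.
    for n in nums:
        yield n * n

def evens_only(nums):
    # Lazily keep only even values from another iterator.
    for n in nums:
        if n % 2 == 0:
            yield n

# Feed one generator into another; nothing runs until values are pulled.
pipeline = evens_only(squares(range(6)))
print(list(pipeline))  # [0, 4, 16]
```

Each stage holds only the value it is currently processing, so the pipeline works the same on a range of six items or six million.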
If you are using Python 2.6+ you should use the next
built-in function, not the .next()
method (which was replaced by __next__
in 3.x). The next
built-in takes an optional default argument to return when the iterator is exhausted, instead of raising StopIteration
:
next((i for i, v in enumerate(a) if v == 666), None)
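A quick check with the list from the question, covering both the hit and the miss:

```python
a = [1, 2, 3, 4]

# Value present: next() returns the first matching index.
found = next((i for i, v in enumerate(a) if v == 4), None)

# Value absent: the default is returned instead of raising StopIteration.
missing = next((i for i, v in enumerate(a) if v == 666), None)

print(found, missing)  # 3 None
```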
You can chain the generator with (None,):

from itertools import chain

a = [1, 2, 3, 4]
print(next(chain((i for i, v in enumerate(a) if v == 6), (None,))))
But note that a.index(2) will not traverse the full list: when 2 is found, the search stops. You can test this:
>>> timeit.timeit("a.index(0)", "a=range(10)")
0.19335955439601094
>>> timeit.timeit("a.index(99)", "a=range(100)")
2.1938486138533335
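list.index does short-circuit, but unlike next() with a default it raises ValueError on a miss, so you would still need a try ... except to get None; a small sketch comparing the two:

```python
a = [1, 2, 3, 4]

# list.index stops at the first match, but raises on a miss.
try:
    idx = a.index(666)
except ValueError:
    idx = None

# The generator version expresses the same fallback in one expression.
idx2 = next((i for i, v in enumerate(a) if v == 666), None)

print(idx, idx2)  # None None
```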