Consider these two functions:
def foo():
    x = 0
    while True:
        yield x
        x += 1
def wrap_foo(limit=10, gen=True):
    fg = foo()
    count = 0
    if gen:
        while count < limit:
            yield next(fg)
            count += 1
    else:
        return [next(fg) for _ in range(limit)]
foo() is a generator, and wrap_foo() just puts a limit on how much data gets generated. I was experimenting with having the wrapper behave either as a generator (gen=True) or as a regular function that loads all the generated data into memory at once (gen=False).
The regular generator behavior works as I'd expect:
In [1352]: [_ for _ in wrap_foo(gen=True)]
Out[1352]: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
However, with gen=False, nothing gets generated. 
In [1351]: [num for num in wrap_foo(gen=False)]
Out[1351]: []
It seems like Python pre-classifies the function as a generator based on the presence of the yield statement (the latter example works perfectly if the yield is commented out).
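A quick check seems to confirm this: even with gen=False, the call returns a generator object rather than a list:
print(type(wrap_foo(gen=False)))  # <class 'generator'>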
Why is this? I would like to understand the mechanisms at play here. I'm running Python 3.6.
It seems like Python pre-classifies the function as a generator based on the presence of the yield statement
Yes, that's exactly what happens. wrap_foo is determined to be a generator function at definition time, because a yield appears anywhere in its body. Calling it never executes the body; it just creates a generator object. The return [next(fg) for _ in range(limit)] in the else branch only runs once you iterate, and in Python 3 a return value inside a generator simply becomes the value attribute of the StopIteration that ends the (now empty) iteration, which is why you get [].
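You can see both halves of this with a small check (a minimal sketch; has_yield is a hypothetical name used only for illustration):
import inspect

def has_yield():
    if False:
        yield  # never executed, but its presence alone makes this a generator function
    return [1, 2, 3]

# Classified as a generator function when the def is compiled:
print(inspect.isgeneratorfunction(has_yield))  # True

# The return value surfaces as StopIteration.value, not as a call result:
g = has_yield()
try:
    next(g)
except StopIteration as e:
    print(e.value)  # [1, 2, 3]
To get both behaviors from one callable, you could make wrap_foo an ordinary function and use generator expressions instead: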
def wrap_foo(limit=10, gen=True):
    fg = foo()
    if gen:
        # lazy: the genexp is created here, but next(fg) runs only on iteration
        return (next(fg) for _ in range(limit))
    else:
        # eager: the list comprehension consumes fg immediately
        return [next(fg) for _ in range(limit)]
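With this version, wrap_foo is an ordinary function, so both branches actually run at call time: gen=True hands back a lazy generator expression and gen=False a fully built list. For example:
print(list(wrap_foo(gen=True)))   # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
print(wrap_foo(gen=False))        # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]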