In computing the Chinese Remainder Theorem from a vector of (residue, modulus) tuples, the following code fails:
c = ((1,5),(3,7),(11,13),(19,23))

def crt(c):
    residues, moduli = zip(*c)
    N = product(moduli)
    complements = (N/ni for ni in moduli)
    scaled_residues = (product(pair) for pair in zip(residues,complements))
    inverses = (modular_inverse(*pair) for pair in zip(complements,moduli))
    si = (product(u) for u in zip(scaled_residues,inverses))
    result = sum(si) % N
    return result
It gives the result 0 (I guess the generated iterables are empty). Yet the following code works perfectly:
def crt(c):
    residues, moduli = zip(*c)
    N = product(moduli)
    complements = list((N/ni for ni in moduli)) # <-- listed
    scaled_residues = (product(pair) for pair in zip(residues,complements))
    inverses = (modular_inverse(*pair) for pair in zip(complements,moduli))
    si = (product(u) for u in zip(scaled_residues,inverses))
    result = sum(si) % N
    return result
This yields the correct result of 8851.

Why should I have to list() one of the first generators? Adding list() to any subsequent generator does not change the failing (0) result; only listing this first generator produces the correct result. What is going on here?
A Python generator is a function that produces a sequence of results lazily. It maintains its local state between values, so execution resumes exactly where it left off each time the next value is requested; you can think of a generator as a powerful kind of iterator. A generator function uses a yield statement rather than a return statement, and calling it returns an iterator object that produces a sequence of values instead of a single result.

Generators are memory efficient for large sequences. A normal function that returns a list builds the entire sequence in memory before returning, whereas a generator computes one value at a time and pauses between values. Like a list, a generator can be looped over, but unlike a list its contents are not stored in memory, which makes generators an efficient way to iterate through large datasets.
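As a minimal sketch (not from the original post), the generator function below shows this behaviour: values are produced lazily, and the generator is exhausted after a single pass, so a second pass over it produces nothing.

def squares(n):
    # Execution pauses at each yield and resumes where it left off
    # the next time a value is requested.
    for i in range(n):
        yield i * i

gen = squares(4)
print(sum(gen))  # 14 -- the generator is consumed here
print(sum(gen))  # 0  -- a second pass yields nothing, so sum() returns 0

Note that sum() over an already-exhausted generator returns 0, which is exactly the failing result seen in the question.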
You iterate twice over complements. You can only iterate once over a generator expression.

If you are on Python 2.x, zip(residues, complements) will consume complements and there is nothing left for zip(complements, moduli). On Python 3.x, zip is itself lazy, so the problem appears later in the code, when sum() actually runs the generators: each iteration would pull two items from complements.
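Below is a minimal corrected sketch, assuming Python 3.8+ so that math.prod can stand in for the post's product() helper and pow(x, -1, m) for modular_inverse() (neither helper is defined in the post). Materializing complements once, here as a tuple, lets both later zips traverse it, and floor division (//) keeps the complements as integers on Python 3.

from math import prod

def crt(c):
    residues, moduli = zip(*c)
    N = prod(moduli)
    # Materialized once, so both zips below get their own pass over it.
    complements = tuple(N // ni for ni in moduli)
    inverses = (pow(comp, -1, ni) for comp, ni in zip(complements, moduli))
    terms = (r * comp * inv
             for r, comp, inv in zip(residues, complements, inverses))
    return sum(terms) % N

print(crt(((1, 5), (3, 7), (11, 13), (19, 23))))  # 8851

Only the value that is consumed more than once needs to be materialized; the rest of the pipeline can stay lazy.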