A fast way to find all the factors of a number is trial division: test every candidate divisor up to the square root of the number, and whenever one divides evenly with no remainder, record both it and its cofactor.
from functools import reduce

def factors(n):
    # Each divisor i found at or below sqrt(n) contributes the pair [i, n//i];
    # reduce concatenates the pairs into one list and set() removes duplicates.
    return set(reduce(list.__add__,
        ([i, n//i] for i in range(1, int(n**0.5) + 1) if n % i == 0)))
This will return all of the factors, very quickly, of a number n.
Why square root as the upper limit?
sqrt(x) * sqrt(x) = x. So if the two factors are the same, they're both the square root. If you make one factor bigger, you have to make the other factor smaller. This means that one of the two will always be less than or equal to sqrt(x), so you only have to search up to that point to find one of the two matching factors. You can then use x / fac1 to get fac2. For example, with x = 100: once you find the factor 4, the matching factor is 100 / 4 = 25, and in every factor pair the smaller member is at or below sqrt(100) = 10.
The reduce(list.__add__, ...) is taking the little lists of [fac1, fac2] and joining them together in one long list.
The [i, n//i] for i in range(1, int(sqrt(n)) + 1) if n % i == 0 returns a pair of factors if the remainder when you divide n by the smaller one is zero. (It doesn't need to check the larger one too; it just gets that by dividing n by the smaller one.)
The set(...) on the outside is getting rid of duplicates, which only happens for perfect squares. For n = 4, this will return 2 twice, so set gets rid of one of them.
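As a quick usage check of the function above (the duplicate pair shows up for any perfect square; printed set ordering may vary):

>>> factors(16)
{1, 2, 4, 8, 16}

Here i = 4 yields the pair [4, 4], and set() keeps a single copy.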
The solution presented by @agf is great, but one can achieve ~50% faster run time for an arbitrary odd number by checking for parity. As the factors of an odd number are always odd themselves, it is not necessary to check even candidates when dealing with odd numbers.
I've just started solving Project Euler puzzles myself. In some problems, a divisor check is called inside two nested for loops, and the performance of this function is thus essential.
Combining this fact with agf's excellent solution, I've ended up with this function:
from functools import reduce
from math import sqrt

def factors(n):
    # Odd numbers have only odd factors, so skip even candidates for odd n.
    step = 2 if n % 2 else 1
    return set(reduce(list.__add__,
        ([i, n//i] for i in range(1, int(sqrt(n)) + 1, step) if n % i == 0)))
However, on small numbers (roughly n < 100), the extra overhead from this alteration may cause the function to take longer.
I ran some tests in order to check the speed. Below is the code used. To produce the different plots, I altered the X = range(1, 100, 1) line accordingly.
import timeit
from functools import reduce
from math import sqrt
from matplotlib.pyplot import plot, legend, show

def factors_1(n):
    step = 2 if n % 2 else 1
    return set(reduce(list.__add__,
        ([i, n//i] for i in range(1, int(sqrt(n)) + 1, step) if n % i == 0)))

def factors_2(n):
    return set(reduce(list.__add__,
        ([i, n//i] for i in range(1, int(sqrt(n)) + 1) if n % i == 0)))

X = range(1, 100000, 1000)
Y = []
for i in X:
    f_1 = timeit.timeit('factors_1({})'.format(i), setup='from __main__ import factors_1', number=10000)
    f_2 = timeit.timeit('factors_2({})'.format(i), setup='from __main__ import factors_2', number=10000)
    Y.append(f_1 / f_2)  # ratio below 1 means the parity-check version is faster

plot(X, Y, label='Running time with/without parity check')
legend()
show()
[Plot: X = range(1, 100, 1)]

No significant difference here, but with bigger numbers, the advantage is obvious:

[Plot: X = range(1, 100000, 1000) (only odd numbers)]
[Plot: X = range(2, 100000, 100) (only even numbers)]
[Plot: X = range(1, 100000, 1001) (alternating parity)]
agf's answer is really quite cool. I wanted to see if I could rewrite it to avoid using reduce(). This is what I came up with:
import itertools

flatten_iter = itertools.chain.from_iterable

def factors(n):
    return set(flatten_iter((i, n//i)
        for i in range(1, int(n**0.5) + 1) if n % i == 0))
I also tried a version that uses tricky generator functions:
def factors(n):
    return set(x for tup in ([i, n//i]
        for i in range(1, int(n**0.5) + 1) if n % i == 0) for x in tup)
I timed it by computing:
start = 10000000
end = start + 40000
for n in range(start, end):
    factors(n)
I ran it once to let Python compile it, then ran it under the time(1) command three times and kept the best time.
Note that the itertools version builds a tuple and passes it to flatten_iter(). If I change the code to build a list instead, it slows down slightly.
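For reference, that list-building variant would look like this, a minimal sketch of the change described above (it reuses the flatten_iter alias defined earlier):

def factors(n):
    # Same as the itertools version, but builds a list per factor pair
    # instead of a tuple; measured slightly slower in the tests above.
    return set(flatten_iter([i, n//i]
        for i in range(1, int(n**0.5) + 1) if n % i == 0))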
I believe that the tricky generator functions version is the fastest possible in Python. But it's not really much faster than the reduce version; based on my measurements it's roughly 4% faster.
There is an industry-strength algorithm in SymPy called factorint:
>>> from sympy import factorint
>>> factorint(2**70 + 3**80)
{5: 2,
41: 1,
101: 1,
181: 1,
821: 1,
1597: 1,
5393: 1,
27188665321L: 1,
41030818561L: 1}
This took under a minute. It switches among a cocktail of methods; see the SymPy factorint documentation for details.
Given all the prime factors, all other factors can be built easily.
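As a minimal sketch of that step (the helper name all_divisors is my own; SymPy also ships sympy.divisors, which does this for you):

from itertools import product
from functools import reduce
from sympy import factorint

def all_divisors(n):
    # Every divisor is a product of prime powers p**k with
    # 0 <= k <= the multiplicity of p in n.
    prime_powers = [[p**k for k in range(m + 1)]
                    for p, m in factorint(n).items()]
    return sorted(reduce(lambda a, b: a * b, combo, 1)
                  for combo in product(*prime_powers))

For example, all_divisors(12) returns [1, 2, 3, 4, 6, 12].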
Note that even if the accepted answer was allowed to run for long enough (i.e. an eternity) to factor the above number, for some large numbers it will fail, such as the following example. This is due to the sloppy int(n**0.5). For example, when n = 10000000000000079**2, we have
>>> int(n**0.5)
10000000000000078L
Since 10000000000000079 is a prime, the accepted answer's algorithm will never find this factor. Note that it's not just an off-by-one; for larger numbers it will be off by more. For this reason it's better to avoid floating-point numbers in algorithms of this sort.
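One way to avoid the floating-point issue, assuming Python 3.8+ (newer than what the answers above were written against), is math.isqrt, which computes an exact integer square root:

from math import isqrt

n = 10000000000000079**2
print(int(n**0.5))  # 10000000000000078 -- float rounding loses the last digit
print(isqrt(n))     # 10000000000000079 -- exact, so the factor is not missed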
Here's an alternative to @agf's solution which implements the same algorithm in a more pythonic style:
def factors(n):
    return set(
        factor for i in range(1, int(n**0.5) + 1) if n % i == 0
        for factor in (i, n//i)
    )
This solution works in both Python 2 and Python 3 with no imports and is much more readable. I haven't tested the performance of this approach, but asymptotically it should be the same, and if performance is a serious concern, neither solution is optimal.
An alternative approach to agf's answer:
def factors(n):
    result = set()
    for i in range(1, int(n ** 0.5) + 1):
        div, mod = divmod(n, i)  # quotient and remainder in a single call
        if mod == 0:
            result |= {i, div}
    return result
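A quick usage check of this version, sorting the result for readable output:

print(sorted(factors(36)))  # [1, 2, 3, 4, 6, 9, 12, 18, 36]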