 

Calculating large factorial time complexity

I ran across a problem where I needed to calculate the values of very large factorials. I solved this problem in C++ in two different ways; here I only want to know whether my complexity analysis is accurate.

In either method I am representing very large numbers as vectors where v[0] represents the least significant digit, and the value at the last index represents the most significant digit. Version 1's code can be found in this gist.
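For reference, since the gist itself isn't reproduced here, the following is a minimal sketch of what such a multiplication function can look like (the name mirrors the description above, but the details are my own assumptions rather than the exact gist code): each digit is multiplied by n with a running carry, and any leftover carry digits are appended at the end.

#include <vector>

// Multiply the number stored in numVector (least significant digit first)
// by the integer n, returning the product in the same digit-vector format.
std::vector<int> multiplyVectorByInteger(const std::vector<int>& numVector, int n) {
    std::vector<int> returnVec;
    returnVec.reserve(numVector.size() + 1);
    long long carry = 0;
    for (std::size_t i = 0; i < numVector.size(); ++i) {
        long long product = static_cast<long long>(numVector[i]) * n + carry;
        returnVec.push_back(static_cast<int>(product % 10));  // next digit of the result
        carry = product / 10;                                 // carry the rest forward
    }
    while (carry) {                                           // remaining high-order digits
        returnVec.push_back(static_cast<int>(carry % 10));
        carry /= 10;
    }
    return returnVec;
}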

Given the above code, it seems multiplyVectorByInteger() is O(log(n*k)) where n is the given integer, and k is the number represented by the vector. My logic is that we'll be doing some number of steps proportional to the length of the resulting number n*k in order to produce a vector representing n*k. The length of n*k is O(log(n*k)). Some of the steps will be carried out in the for loop, others in the following while loop.

In this program to find large factorials, whenever we call multiplyVectorByInteger() we will be passing in an integer n and the vector representation of (n-1)!. This means if we want to find 6!, we pass in the integer 6 and the vector representation of 5!. The function will return the vector representation of 6!. Using the previous information I believe I can say the complexity is O(log(i!)) where i is the passed in integer. In order to find large factorials, we must call this method O(n) times, where n is the factorial we are trying to find (a sketch of such a driver loop follows the list below). Our accumulated logic will look like this:

1!       = 1!
1!*2     = 2!
2!*3     = 3!
3!*4     = 4!
...
(n-1)!*n = n!
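A minimal sketch of that driver loop, assuming the multiplyVectorByInteger() sketched above (again my own reconstruction, not the gist):

#include <vector>

// Build the digit vector for n! by the accumulation shown above:
// 1! * 2 = 2!, 2! * 3 = 3!, ..., (n-1)! * n = n!.
std::vector<int> factorialVector(int n) {
    std::vector<int> result{1};                       // represents 1 (0! and 1!)
    for (int i = 2; i <= n; ++i) {
        result = multiplyVectorByInteger(result, i);  // (i-1)! * i = i!
    }
    return result;                                    // digits of n!, least significant first
}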

Since at each level we're calculating i!, we're consequently performing O(log(i!)) steps at each level. The summation to show this is as follows:

Sum[i=1,n]: O(log(i!)) = Sum[i=1,n]: O(i * log(i)) = O(n² * log(n!))

My logic for jumping from the second summation to the Big-Oh notation is as follows. Breaking this out, we get:

1log(1) + 2log(2) + 3log(3) + ... + nlog(n)

It is obvious we get O(n^2) terms of log(1) + log(2) + ... + log(n). Log rules remind us that log(a) + log(b) = log(ab), which means the log terms in this case collapse to log(n!). Thus we have O(n^2)log(n!).

This would make the overall time complexity of this program O(n^2log(n!)). Is this analysis correct?

Naive version time complexity

To practice complexity analysis I want to take a look at what seems like a less efficient solution. Suppose we change our multiplyVectorByInteger() function such that instead of multiplying a vector representation of k by an integer n in O(log(n!)) time to produce n!, the new multiplyVectorByIntegerNaive() function forms the product by summing n copies of the vector representation of the number.

multiplyVectorByIntegerNaive() exists in this gist. It utilizes a function addVectors() whose complexity is O(n), where n is the size of the larger of the two vectors.
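Since that gist isn't shown here either, here is a sketch consistent with the description (my own naming and details where the text doesn't pin them down): addVectors() performs grade-school addition of two digit vectors, and multiplyVectorByIntegerNaive() replaces the single multiplication pass with repeated addition.

#include <vector>

// Add two numbers stored as least-significant-digit-first vectors.
// The work is proportional to the length of the longer vector.
std::vector<int> addVectors(const std::vector<int>& a, const std::vector<int>& b) {
    std::vector<int> sum;
    int carry = 0;
    for (std::size_t i = 0; i < a.size() || i < b.size() || carry; ++i) {
        int digit = carry;
        if (i < a.size()) digit += a[i];
        if (i < b.size()) digit += b[i];
        sum.push_back(digit % 10);
        carry = digit / 10;
    }
    return sum;
}

// Naive multiplication by repeated addition: n * k computed as
// k + k + ... + k, i.e. n - 1 vector additions (assumes n >= 1).
std::vector<int> multiplyVectorByIntegerNaive(const std::vector<int>& numVector, int n) {
    std::vector<int> result = numVector;              // first copy of k
    for (int i = 1; i < n; ++i) {
        result = addVectors(result, numVector);       // add another copy of k
    }
    return result;
}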

It's clear we're still calling this new multiplication function n times, but we need to see if the complexity has changed. For example, given the integer 6 and the vector representation of 5!, we add 5! + 5! + 5! + 5! + 5! + 5! to get 6*5! = 6!. If the given integer to our multiplication function is i, it is clear we do i-1 additions. We can enumerate the steps for the previous example call to our naive multiplication function.

5! + 5!
2*5! + 5!
3*5! + 5!
4*5! + 5!
5*5! + 5!

Writing out the full summation should now give:

Sum[i=1,n]: (i-1) * O(log(i!))

It appears the asymptotic complexity of both methods is the same given my calculations are accurate. Is this true?

Dominic Farolino, asked Aug 23 '16



2 Answers

The complexity of the function in the gist you have provided is O(log10(n!)), where n is the number you pass to the method.

The reasoning for this is evident from the first part of the code:

for (int i = 0; i < numVector.size(); ++i) {
    returnVec[i] = (numVector[i]*n + carry) % 10;
    carry = (numVector[i]*n + carry) / 10;
}

The numVector passed in represents (n - 1)!, i.e. it contains all the digits that make up that number. However, the length of that number is simply ⌊log10((n-1)!)⌋ + 1, i.e. its number of digits. You can see this from a simple example:

if (n-1)! is 100, then the length of the numVector will be 3, which is the same as ⌊log10(100)⌋ + 1 = 3.
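A quick check of that digit-count relationship (my own example, not part of the original answer):

#include <cmath>
#include <cstdio>

int main() {
    long long values[] = {100, 120, 3628800};   // e.g. 5! = 120, 10! = 3628800
    for (long long x : values) {
        int digits = 0;
        for (long long t = x; t > 0; t /= 10) ++digits;   // length of the digit vector
        int viaLog = static_cast<int>(std::floor(std::log10(static_cast<double>(x)))) + 1;
        std::printf("%lld: %d digits, floor(log10)+1 = %d\n", x, digits, viaLog);
    }
    return 0;
}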

The same logic applies to the while loop too:

while (carry) {
    returnVec.push_back(carry%10);
    carry /= 10;
}

Since the value of carry will not be larger than n (you can prove this for yourself), the maximum number of times this loop will run is O(log10(n)), which is no larger than ⌈log10(n!)⌉. The total complexity of the function is therefore O(log10(n!)).

Therefore, to calculate k!, the complexity of your code (including main) will be O(k * log10(k!)).
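Spelling out the step from the per-call cost to this total (my own addition, not part of the original answer): each of the k calls from main costs at most O(log10(k!)), so

Sum[i=1,k]: O(log10(i!)) <= k * O(log10(k!)) = O(k * log10(k!))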

Naive version

For the naive version, the only thing that has changed is that now the method is manually stepping through the multiplication in the form of addition. This is something the other version skipped by explicitly multiplying each value by n:

(numVector[i]*n + carry)

This increases the complexity of the function to O(k * log10(n)), where k is the integer passed in and k! = n, and thus the complexity of the entire code is now O(k² * log10(k!)).
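Spelling out the corresponding total for the naive version (again my own addition), with i the integer passed to each call:

Sum[i=1,k]: O(i * log10(i!)) <= Sum[i=1,k]: O(k * log10(k!)) = O(k² * log10(k!))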

smac89, answered Sep 30 '22


Multiplying a k-digit number by an integer or adding two k-digit numbers both take time proportional to k.

Hence, in the multiply version, the total workload is

Sum[i=1,n]: log(i!) ~ Sum[i=1,n]: i.log(i) ~ n².log(n)

In the add version,

Sum[i=1,n]: i.log(i!) ~ Sum[i=1,n]: i².log(i) ~ n³.log(n)

These results can be established by using the Stirling approximation and an integral instead of a summation:

Int x.log(x) dx = x²(log(x)/2 - 1/4)
Int x².log(x) dx = x³(log(x)/3 - 1/9)
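Evaluating those antiderivatives at x = n makes the asymptotics explicit (my own step, not part of the original answer):

n²(log(n)/2 - 1/4) ~ n².log(n)
n³(log(n)/3 - 1/9) ~ n³.log(n)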

As could be expected, there is an extra n factor.

Yves Daoust, answered Sep 30 '22