For an assignment, I was supposed to write an algorithm that uses at most n + O(log(n)) extra bits of memory, where n is the size of an input array (the details of what the algorithm is actually supposed to do aren't important here).
I submitted an algorithm that passes all of the test cases; however, my grader says that I am using more than n + O(log(n)) bits of memory. Their reason is that, as part of my algorithm, I add the quantity n * i to every element of the array, where i = 1, 2, 3, ..., n is a loop index. They argue that for very large values of n, I will need extra memory to store these large numbers.
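For concreteness, here is a minimal Python sketch of the kind of loop they flagged (the real assignment algorithm is omitted, so this only illustrates the bit growth they are worried about):

```python
def add_offsets(a):
    """Add n*i to element i (1-based), as in the flagged loop."""
    n = len(a)
    for i in range(1, n + 1):
        a[i - 1] += n * i  # the added value can be as large as n*n

a = [1] * 8
add_offsets(a)
print(a)                    # [9, 17, 25, 33, 41, 49, 57, 65]
print(max(a).bit_length())  # 7: elements now need roughly 2*log2(n) bits
```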
This leads me to the following question: is it true that adding n * i to every number makes my space usage exceed n + O(log(n)) bits? My experience with algorithm analysis is quite limited, and I have personally never seen storing large numbers used as a justification for an increase in space complexity. But suppose, for the sake of argument, that it does increase the complexity -- would I then be using more than n + O(log(n)) bits?
I would like to formulate an argument for a challenge, but I just want to make sure that I am in the right before doing so.
Let b1 be the number of bits needed to store each number before adding n * i to it, and b2 be the number of bits needed afterwards.
Inequality (1):
b2 - b1 <= log(n * n) = 2 log(n)
Proof (1):
Lemma 1: Binary is an (essentially) optimal encoding scheme for integers in memory, so counting bits is a fair measure of space.
Lemma 2: The bit length of the sum of two integers is at most one more than the bit length of the longer operand.
Since i <= n, the quantity added is at most n * n, which takes at most log(n * n) = 2 log(n) bits; by Lemma 2, adding it to a b1-bit number yields at most max(b1, 2 log(n)) + 1 bits, and Inequality (1) follows.
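Lemma 2 can be sanity-checked empirically; here is a small Python check of the bit-length bound on random inputs (just an illustration, not part of the proof):

```python
import random

# Check: the bit length of a sum is at most one more than the bit
# length of the longer operand.
for _ in range(100_000):
    a = random.randint(1, 1 << 64)
    b = random.randint(1, 1 << 64)
    assert (a + b).bit_length() <= max(a.bit_length(), b.bit_length()) + 1
```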
From Inequality (1), in the extreme case where b1 -> 0, we get b2 = 2 log(n), so each of the n elements grows by up to 2 log(n) bits and the total increase in space is about 2 n log(n) bits. The total space would then be C + O(n log(n)), which exceeds n + O(log(n)).
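A quick back-of-the-envelope computation of that worst case (illustrative numbers only, assuming b1 -> 0 and b2 = 2 log(n)):

```python
# Worst-case extra bits if each of the n elements starts near 0 bits
# and must end up holding a value as large as n*n.
def extra_bits_worst_case(n):
    per_element = (n * n).bit_length()  # roughly 2*log2(n) bits per element
    return n * per_element              # roughly 2*n*log2(n) bits in total

for n in (10, 1_000, 1_000_000):
    print(n, extra_bits_worst_case(n))  # grows like n*log(n), not like n
```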
Disclaimer: this is not a complete proof for your problem, because I don't know exactly how many bits you used for each number at the beginning.