Is it possible to count the distinct digits in a number in constant time, O(1)?

Suppose n = 1519; the output should be 3, since there are 3 distinct digits (1, 5, 9).

I have done it in O(N) time, but does anyone know how to find it in O(1) time?
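For reference, a minimal sketch of the straightforward O(N) approach mentioned above; Python and the function name are illustrative choices, not from the original post:

```python
def count_distinct_digits(n: int) -> int:
    """Count the distinct decimal digits of n; O(N) where N is the number of digits."""
    n = abs(n)
    seen = set()
    while True:
        seen.add(n % 10)   # record the last digit
        n //= 10           # drop the last digit
        if n == 0:
            break
    return len(seen)

print(count_distinct_digits(1519))  # prints 3: digits 1, 5, 9
```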
I assume N is the number of digits of n. If the size of n is unlimited, this can't be done in O(1) time in general.

Consider the number n = 111...111 with 2 trillion digits. If I switch one of those digits from a 1 to a 2, there is no way to discover this without in some way looking at every single digit. Thus processing a number with 2 trillion digits must take (of the order of) 2 trillion operations at least, and in general, a number with N digits must take (of the order of) N operations at least.

However, for almost all numbers, the simple O(N) algorithm finishes very quickly, because you can stop as soon as you have seen 10 distinct digits. Almost all numbers of sufficient length contain all 10 digits: e.g. the probability of not terminating with the answer 10 after looking at the first 100 digits is about 0.00027, and after the first 1000 digits it is about 1.7e-45. But unfortunately, there are some oddities which make the worst case O(N).
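A minimal sketch of that early-terminating scan, plus a quick inclusion-exclusion check of the probabilities quoted above; Python and the function names are illustrative assumptions, not part of the original answer:

```python
from math import comb

def count_distinct_digits_early_exit(n: int) -> int:
    """O(N) in the digit count, but stops as soon as all 10 digits have been seen."""
    seen = set()
    for ch in str(abs(n)):
        seen.add(ch)
        if len(seen) == 10:   # no point scanning any further
            break
    return len(seen)

def prob_not_all_ten(k: int) -> float:
    """Probability that k uniformly random digits do NOT cover all 10 digits
    (inclusion-exclusion over which digits are missing)."""
    return sum((-1) ** (j + 1) * comb(10, j) * ((10 - j) / 10) ** k
               for j in range(1, 10))

print(prob_not_all_ten(100))   # ~0.00027
print(prob_not_all_ten(1000))  # ~1.7e-45
```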
After seeing that someone posted a serious answer to this question, I'd rather repeat my own cheat here, which is a special case of the answer described by @SimonNickerson: O(1) is not possible unless you are working in radix 2, because in that case every number other than 0 contains both a 1 and a 0, and thus my "solution" works not only for integers...
EDIT
How about 2^k - 1? Isn't that all 1s?
Drat! True... I should have known that when something seems so easy, it is flawed somehow... If I had the all-0 case covered, I should have covered the all-1 case too.
Luckily this case can be tested quite quickly (if addition and bitwise AND are considered O(1) operations): if x is the number to be tested, compute y = (x + 1) AND x. If y = 0, then x = 2^k - 1, because that is the only case in which the addition flips every bit. Of course, this is still quite a bit flawed: once the bit length exceeds the bus width, the bitwise operators are no longer O(1), but rather O(N).
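A minimal sketch of that test and of the radix-2 "cheat" it supports; Python and the function names are illustrative assumptions:

```python
def is_all_ones(x: int) -> bool:
    """True iff x is of the form 2**k - 1 (its binary representation is all 1s):
    adding 1 to such a number flips every bit, so (x + 1) & x == 0."""
    return x > 0 and (x + 1) & x == 0

def distinct_binary_digits(x: int) -> int:
    """The radix-2 cheat: 0 and all-ones numbers have one distinct digit;
    every other non-negative integer contains both a 0 and a 1."""
    return 1 if x == 0 or is_all_ones(x) else 2

for x in (0, 1, 7, 10, 255, 256):
    print(x, bin(x), distinct_binary_digits(x))
```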
At the same time, I think it can be brought down to O(log N) by breaking the number into bus-width-sized chunks and AND-ing neighboring chunks together, repeating until only one is left: if there were no 0s in the number tested, the last remaining chunk will be all 1s too...
EDIT2: I was wrong... This is still O(N).
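For completeness, a sketch of the chunk-and-AND idea from the previous edit, folding the chunks together sequentially (which gives the same result as the pairwise reduction). The 64-bit chunk size and the helper name are my own assumptions, and, as the edit above concedes, every chunk still has to be visited, so this remains O(N):

```python
CHUNK_BITS = 64                      # assumed bus width, for illustration
CHUNK_MASK = (1 << CHUNK_BITS) - 1

def is_all_ones_chunked(x: int, n_bits: int) -> bool:
    """Test whether the low n_bits of x are all 1s by AND-ing 64-bit chunks together."""
    acc = CHUNK_MASK
    full_chunks, rest = divmod(n_bits, CHUNK_BITS)
    for _ in range(full_chunks):
        acc &= x & CHUNK_MASK        # fold the next chunk into the accumulator
        x >>= CHUNK_BITS
    if acc != CHUNK_MASK:
        return False
    # handle a final partial chunk, if any
    return rest == 0 or (x & ((1 << rest) - 1)) == ((1 << rest) - 1)

print(is_all_ones_chunked(2**130 - 1, 130))  # True: 130 one-bits
print(is_all_ones_chunked(2**130 - 2, 130))  # False: one zero bit
```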