I have a number in base 10 with around 10k digits. I want to convert it to base 2 (1010101001...). All I can think of is the primitive algorithm:
take the last digit mod 2 -> write down a bit
divide the number by 2
It shouldn't be hard to implement primary-school division on a string, but I'm thinking it's very inefficient. If I'm right, it will be O(l^2), where l is the length of the number in base 10. Can that be done faster?
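For reference, here is a minimal Python sketch of that primitive algorithm, operating directly on the digit string. The helper names div2 and decimal_to_binary are my own, not from any library:

    # Repeated division by 2 on a decimal digit string.
    # Each pass over the digits is O(l), and there are O(l) passes,
    # giving the O(l^2) total mentioned above.

    def div2(digits):
        """Divide a decimal digit string by 2; return (quotient, remainder)."""
        quotient = []
        remainder = 0
        for d in digits:
            cur = remainder * 10 + int(d)
            quotient.append(str(cur // 2))
            remainder = cur % 2
        # Strip leading zeros, but keep at least one digit.
        return "".join(quotient).lstrip("0") or "0", remainder

    def decimal_to_binary(digits):
        """Convert a decimal string to binary by repeatedly dividing by 2."""
        bits = []
        while digits != "0":
            digits, bit = div2(digits)
            bits.append(str(bit))
        return "".join(reversed(bits)) or "0"

    print(decimal_to_binary("12345"))  # -> 11000000111001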
From what I understand, you have your big number represented as a sequence of decimal digits. If that is so, you can compute a "binary" representation using multiplication and addition:
value = sum(i = 0 .. n-1) 10^i * digit_i
This computation can be split into parts in a divide-and-conquer way, although I'm not sure whether you can arrive at an O(n log n) algorithm.
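Here is a sketch of that split, assuming a language with fast big-integer multiplication (Python's built-in int uses Karatsuba for large operands, so multiplication is subquadratic). The function name decimal_to_int is hypothetical:

    # Divide-and-conquer evaluation of sum 10^i * digit_i.
    # Splitting the digit string in half gives
    #   value = high * 10^len(low) + low,
    # applied recursively, so the work is dominated by big-integer
    # multiplications rather than O(n^2) digit-by-digit arithmetic.

    def decimal_to_int(s):
        """Evaluate a decimal digit string by splitting it in half."""
        if len(s) <= 9:                    # base case fits in a machine word
            return int(s)
        mid = len(s) // 2
        high, low = s[:mid], s[mid:]
        return decimal_to_int(high) * 10 ** len(low) + decimal_to_int(low)

    n = decimal_to_int("9" * 10000)        # a 10k-digit number
    bits = bin(n)[2:]                      # base-2 string straight from the integer

Note that this sketch recomputes powers of 10 at every level; a serious implementation would precompute and cache them, which is roughly what the subquadratic radix-conversion routines in big-number libraries such as GMP do.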