
Computational complexity of base conversion

What is the complexity of converting a very large n-bit number to a decimal representation?

My thought is that the elementary algorithm of repeated integer division, taking the remainder to get each digit, would have O(M(n) log n) complexity, where M(n) is the complexity of the multiplication algorithm; however, the division is not between two n-bit numbers but between one n-bit number and a small constant, so it seems to me the complexity could be smaller.

asked Feb 09 '15 by Tanner


1 Answer

Naive base-conversion as you described takes quadratic time; you do about n bigint-by-smallint divisions, most of which take time linear in the size of the n-bit bigint.
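A minimal sketch of that quadratic method in Python (the function name is mine); each divmod by the small base costs O(n) bigint work and runs once per output digit, about Θ(n) times:

```python
def to_decimal_naive(x: int) -> str:
    if x == 0:
        return "0"
    digits = []
    while x > 0:
        x, r = divmod(x, 10)             # bigint-by-smallint division: O(n) time
        digits.append(chr(ord("0") + r)) # remainder is the next (low-order) digit
    return "".join(reversed(digits))
```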

You can do base conversion in O(M(n) log n) time, however, by picking a power of the target base that's roughly the square root of the number to be converted, doing divide-and-remainder by it (which can be done in O(M(n)) time via Newton's method), and recursing on the two halves.
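A minimal sketch of that divide-and-conquer recursion, again in Python with names of my choosing. One caveat: CPython's built-in bigint divmod is not the O(M(n)) Newton division the answer assumes, so this illustrates the recursion structure rather than achieving the stated bound; a real implementation would also cache the powers of 10 rather than recomputing them.

```python
def to_decimal_dc(x: int, ndigits: int) -> str:
    # ndigits is an upper bound on x's decimal length; zero-padding in the
    # base case keeps the concatenation of the two halves aligned.
    if ndigits <= 32:                      # small inputs: plain formatting
        return format(x, "0{}d".format(ndigits))
    m = ndigits // 2                       # digits in the low half
    hi, lo = divmod(x, 10 ** m)            # one big divide-and-remainder step
    return to_decimal_dc(hi, ndigits - m) + to_decimal_dc(lo, m)

def to_decimal(x: int) -> str:
    # 30103/100000 > log10(2), so this over-estimates the digit count,
    # which is safe: the result is just zero-padded and stripped below.
    ndigits = x.bit_length() * 30103 // 100000 + 1
    return to_decimal_dc(x, ndigits).lstrip("0") or "0"
```

With a genuinely subquadratic divide-and-remainder (Newton's method gives O(M(n))), the recurrence T(n) = 2T(n/2) + O(M(n)) solves to the O(M(n) log n) bound above.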

answered Oct 06 '22 by tmyklebu