Fast integer to decimal conversion

Given an (unsigned) integer, what is the generally fastest way to convert it into a string that contains its decimal representation?

The naïve way of doing that is repeatedly dividing by 10, until you reach zero. I dislike this approach, because it

  • uses integer division, which is slow and not available in hardware on some embedded platforms
  • requires the programmer to reverse the string afterwards, which doubles the number of memory operations needed.
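For concreteness, the naive approach looks roughly like this (a sketch; the function name and buffer handling are mine):

```c
#include <stdint.h>
#include <string.h>

/* Naive conversion: emit digits least-significant first by repeated
   division by 10, then reverse the buffer in place. */
void u32_to_dec_naive(char *out, uint32_t v) {
    char *p = out;
    do {
        *p++ = (char)('0' + v % 10);
        v /= 10;
    } while (v != 0);
    *p = '\0';
    /* Reverse: the digits came out backwards. */
    for (char *q = out; q < --p; q++) {
        char t = *q; *q = *p; *p = t;
    }
}
```

Note that every digit is written once and then touched again during the reversal, which is the doubling of memory operations mentioned above.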

I thought of the following method to convert integers to decimal. Is this a good idea? And how is this done in common implementations of functions like printf?

#include <stdint.h>

const static uint64_t i64_tab[20] = {
                     1u,
                    10u,
                   100u,
                  1000u,
                 10000u,
                100000u, /* 10^ 5 */
               1000000u,
              10000000u,
             100000000u,
            1000000000u,
           10000000000u, /* 10^10 */
          100000000000u,
         1000000000000u,
        10000000000000u,
       100000000000000u,
      1000000000000000u, /* 10^15 */
     10000000000000000u,
    100000000000000000u,
   1000000000000000000u,
  10000000000000000000u  /* 10^19 */
};

void uint64_to_string(char *out, uint64_t in) {
  int i;
  uint64_t tenpow;
  char accum;

  for (i = 19;i > 0;i--) {
    if (in >= i64_tab[i]) break;
  }

  do {
    tenpow = i64_tab[i];
    accum = '0';

    while (in >= tenpow) {
      in -= tenpow;
      accum++;
    }

    *out++ = accum;

  } while (i --> 0);

  *out = '\0';
}

const static uint32_t i32_tab[10] = {
           1u,
          10u,
         100u,
        1000u,
       10000u,
      100000u, /* 10^ 5 */
     1000000u,
    10000000u,
   100000000u,
  1000000000u, /* 10^9  */
};

void uint32_to_string(char *out, uint32_t in) {
  int i;
  uint32_t tenpow;
  char accum;

  for (i = 9;i > 0;i--)
    if (in >= i32_tab[i]) break;

  do {
    tenpow = i32_tab[i];
    accum = '0';

    while (in >= tenpow) {
      in -= tenpow;
      accum++;
    }

    *out++ = accum;

  } while (i --> 0);

  *out = '\0';
}
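One way to sanity-check a routine like this is to compare it against the C library's own formatting. A throwaway harness (the conversion function is condensed from the code above; snprintf serves as the reference):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Condensed copy of uint32_to_string from above, so this
   harness compiles on its own. */
static const uint32_t i32_tab[10] = {
    1u, 10u, 100u, 1000u, 10000u, 100000u,
    1000000u, 10000000u, 100000000u, 1000000000u
};

static void uint32_to_string(char *out, uint32_t in) {
    int i;
    for (i = 9; i > 0; i--)
        if (in >= i32_tab[i]) break;
    do {
        uint32_t tenpow = i32_tab[i];
        char accum = '0';
        while (in >= tenpow) { in -= tenpow; accum++; }
        *out++ = accum;
    } while (i-- > 0);
    *out = '\0';
}

/* Returns 1 if the hand-rolled conversion matches snprintf. */
int check(uint32_t v) {
    char a[16], b[16];
    uint32_to_string(a, v);
    snprintf(b, sizeof b, "%u", v);
    return strcmp(a, b) == 0;
}
```

Exercising the edge cases (0, single digits, the largest value) catches the off-by-one mistakes this style of loop is prone to.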
asked Dec 12 '25 by fuz
1 Answer

The fastest approach on all but the simplest (e.g. 8-bit) microcontrollers is to use division, but reduce the number of divisions by generating several digits at once.

You will find very highly optimized code in answers to my question here. Adapting it to C is a trivial edit to eliminate std::string -- no C++ features are used in the actual conversion. The core is:

while (val >= 100)
{
    int pos = val % 100;
    val /= 100;
    *(short*)(c - 1) = *(short*)(digit_pairs + 2*pos); // or use memcpy
    c -= 2;
}
while (val > 0)
{
    *c-- = '0' + (val % 10);
    val /= 10;
}

I also provided optimized division-free code for 8-bit micros, similar in idea to the code in the question, but without loops. It ends up with a lot of code like this:

    if (val >= 80) {
        ch |= '8';
        val -= 80;
    }
    else if (val >= 40) {
        ch |= '4';
        val -= 40;
    }
    if (val >= 20) {
        ch |= '2';
        val -= 20;
    }
    if (val >= 10) {
        ch |= '1';
        val -= 10;
    }
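Wrapped into a complete function for the two-digit case, the ladder looks like this (a sketch of the technique, not the answer's exact code; it always emits two digits, leading zero included). The trick is that the ORed values 8, 4, 2, 1 assemble the tens digit in binary on top of '0', and the 80/40 pair must be exclusive so the digit never exceeds 9:

```c
#include <stdint.h>

/* Division-free conversion of a value in 0..99 using the
   subtract-and-OR ladder; larger ranges just need more steps. */
void u8_to_dec99(char out[3], uint8_t val) {
    char ch = '0';
    if (val >= 80)      { ch |= 8; val -= 80; } /* exclusive with 40 */
    else if (val >= 40) { ch |= 4; val -= 40; }
    if (val >= 20)      { ch |= 2; val -= 20; }
    if (val >= 10)      { ch |= 1; val -= 10; }
    out[0] = ch;                  /* tens digit, e.g. 8|1 = 9 */
    out[1] = (char)('0' + val);   /* val is now 0..9 */
    out[2] = '\0';
}
```

After a successful >= 80 step the remainder is at most 19, so only the 10 step can still fire, which is why 8 can only ever combine with 1 to give 9.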
answered Dec 16 '25 by Ben Voigt