How to convert a 128-bit integer to a decimal ascii string in C?

Tags: c, string, int, ascii

I'm trying to convert a 128-bit unsigned integer stored as an array of 4 unsigned ints to the decimal string representation in C:

unsigned int src[] = { 0x12345678, 0x90abcdef, 0xfedcba90, 0x8765421 };
printf("%s", some_func(src)); // gives "53072739890371098123344"

(The input and output examples above are completely fictional; I have no idea what that input would produce.)

If I were converting to hex, binary, or octal, this would be a simple matter of masks and bit shifts to peel off the least significant characters. However, it seems to me that I need to do base-10 division. Unfortunately, I can't remember how to do that across multiple ints, and the system I'm using doesn't support data types larger than 32 bits, so using a 128-bit type is not possible. Using a different language is also out, and I'd rather avoid a big-number library just for this one operation.
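For reference, the base-10 division the question alludes to can be done with nothing wider than 32 bits by dividing the number 16 bits at a time: the intermediate value (remainder * 65536 + next 16-bit chunk) stays below 2^20, because the remainder of a division by 10 is at most 9. Below is a minimal sketch under the question's assumptions (32-bit unsigned int, least significant word first); divmod10 and to_decimal are hypothetical helper names, not part of any library.

#include <stdio.h>
#include <string.h>

/* Divide the 128-bit value in n[] (n[0] least significant) by 10 in place,
   16 bits at a time so everything fits in 32-bit arithmetic; return n % 10. */
static unsigned int divmod10(unsigned int n[4])
{
  unsigned int rem = 0;
  unsigned int hi, lo, q_hi, q_lo;
  int i;

  for (i = 3; i >= 0; i--)  /* most significant word first */
  {
    hi   = (rem << 16) | (n[i] >> 16);      /* rem < 10, so this cannot overflow */
    q_hi = hi / 10;
    rem  = hi % 10;
    lo   = (rem << 16) | (n[i] & 0xFFFFu);
    q_lo = lo / 10;
    rem  = lo % 10;
    n[i] = (q_hi << 16) | q_lo;
  }
  return rem;
}

/* Peel off decimal digits least significant first, then reverse them. */
static void to_decimal(const unsigned int src[4], char *out)
{
  unsigned int n[4];
  char tmp[40];  /* a 128-bit value has at most 39 decimal digits */
  int len = 0, i;

  memcpy(n, src, sizeof(n));
  do {
    tmp[len++] = (char)('0' + divmod10(n));
  } while (n[0] | n[1] | n[2] | n[3]);
  for (i = 0; i < len; i++)
    out[i] = tmp[len - 1 - i];
  out[len] = '\0';
}

int main(void)
{
  unsigned int src[] = { 0x12345678, 0x90abcdef, 0xfedcba90, 0x8765421 };
  char buf[40];

  to_decimal(src, buf);
  printf("%s\n", buf);  /* expected: 11248221411398543556294285637029484152 (see the answer below) */
  return 0;
}

Repeated division like this does one pass over the four words per output digit; the accepted answer below takes the opposite route and avoids division entirely.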

asked Nov 05 '11 by esilver

People also ask

Is there a 128-bit integer?

In computer architecture, 128-bit integers, memory addresses, or other data units are those that are 128 bits (16 octets) wide. Also, 128-bit central processing unit (CPU) and arithmetic logic unit (ALU) architectures are those that are based on registers, address buses, or data buses of that size.
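For what it's worth, even though standard C does not mandate such a type, GCC and Clang on 64-bit targets provide __int128 / unsigned __int128 as an extension (no help on the asker's 32-bit-only system). printf has no conversion specifier for it, so you still end up peeling off digits yourself; a minimal sketch, assuming one of those compilers:

#include <stdio.h>

int main(void)
{
  /* 0xFEDCBA90 * 2^64 + 42, just an arbitrary large test value */
  unsigned __int128 x = ((unsigned __int128)0xFEDCBA90u << 64) | 42;
  char buf[40];                 /* at most 39 digits plus the NUL */
  int i = 39;

  buf[i] = '\0';
  do {
    buf[--i] = (char)('0' + (unsigned int)(x % 10));
    x /= 10;
  } while (x != 0);
  printf("%s\n", buf + i);
  return 0;
}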

How many values can be represented by 128 bits?

A 128-bit unsigned integer can represent 2^128, roughly 3.4 × 10^38, distinct values, so its decimal form can run to 39 digits. (By comparison, the 128-bit long double floating-point type carries only about 31 significant decimal digits, versus roughly 17 for a 64-bit double.)
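The 39-digit figure, and the buffer size used in the answer below (128 / 3 + 1 + 1), both come from ceil(128 * log10(2)); a one-line check:

#include <math.h>
#include <stdio.h>

int main(void)
{
  /* ceil(128 * log10(2)) = 39 decimal digits for a 128-bit value */
  printf("%d\n", (int)ceil(128.0 * log10(2.0)));
  return 0;
}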

How can a number be converted to a string in C?

Solution: use the sprintf() function (or snprintf(), its bounds-checked form). You can also write your own conversion using the ASCII values of the digit characters.
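A minimal sketch of both routes for an ordinary non-negative int:

#include <stdio.h>

int main(void)
{
  char buf[32];
  char tmp[32];
  int n = 12345;
  int i = 0, j = 0;

  /* 1. Library route */
  snprintf(buf, sizeof buf, "%d", n);
  printf("%s\n", buf);              /* 12345 */

  /* 2. Manual route: '0' + digit, least significant digit first, then reverse */
  do {
    tmp[i++] = (char)('0' + n % 10);
    n /= 10;
  } while (n != 0);
  while (i > 0)
    buf[j++] = tmp[--i];
  buf[j] = '\0';
  printf("%s\n", buf);              /* 12345 again */
  return 0;
}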


1 Answer

Division is not necessary:

#include <string.h>
#include <stdio.h>

typedef unsigned long uint32; /* at least 32 bits; the masks below keep values within 32 bits */

/* N[0] - contains least significant bits, N[3] - most significant */
char* Bin128ToDec(const uint32 N[4])
{
  // Buffer holds up to 43 digit slots plus the terminating NUL (2^128 needs at most 39):
  // log10(x) = log2(x) / log2(10) ~= log2(x) / 3.322
  static char s[128 / 3 + 1 + 1]; // static: the returned string is overwritten on the next call
  uint32 n[4];
  char* p = s;
  int i;

  memset(s, '0', sizeof(s) - 1);
  s[sizeof(s) - 1] = '\0';

  memcpy(n, N, sizeof(n));

  // Feed the 128 bits in from the most significant end: each pass shifts n[]
  // left by one bit and doubles the decimal string, adding the shifted-out bit
  for (i = 0; i < 128; i++)
  {
    int j, carry;

    carry = (n[3] >= 0x80000000);
    // Shift n[] left, doubling it
    n[3] = ((n[3] << 1) & 0xFFFFFFFF) + (n[2] >= 0x80000000);
    n[2] = ((n[2] << 1) & 0xFFFFFFFF) + (n[1] >= 0x80000000);
    n[1] = ((n[1] << 1) & 0xFFFFFFFF) + (n[0] >= 0x80000000);
    n[0] = ((n[0] << 1) & 0xFFFFFFFF);

    // Add s[] to itself in decimal, doubling it
    for (j = sizeof(s) - 2; j >= 0; j--)
    {
      s[j] += s[j] - '0' + carry;

      carry = (s[j] > '9');

      if (carry)
      {
        s[j] -= 10;
      }
    }
  }

  while ((p[0] == '0') && (p < &s[sizeof(s) - 2]))
  {
    p++;
  }

  return p;
}

int main(void)
{
  static const uint32 testData[][4] =
  {
    { 0, 0, 0, 0 },
    { 1048576, 0, 0, 0 },
    { 0xFFFFFFFF, 0, 0, 0 },
    { 0, 1, 0, 0 },
    { 0x12345678, 0x90abcdef, 0xfedcba90, 0x8765421 }
  };
  printf("%s\n", Bin128ToDec(testData[0]));
  printf("%s\n", Bin128ToDec(testData[1]));
  printf("%s\n", Bin128ToDec(testData[2]));
  printf("%s\n", Bin128ToDec(testData[3]));
  printf("%s\n", Bin128ToDec(testData[4]));
  return 0;
}

Output:

0
1048576
4294967295
4294967296
11248221411398543556294285637029484152
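One usage caveat: Bin128ToDec returns a pointer into a static buffer, so each call overwrites the previous result; copy the string out first if you need more than one value at a time. A small sketch, assuming the uint32 typedef and Bin128ToDec definition above:

#include <stdio.h>
#include <string.h>

int main(void)
{
  static const uint32 a[4] = { 0xFFFFFFFF, 0, 0, 0 };
  static const uint32 b[4] = { 0, 1, 0, 0 };
  char first[64];

  strcpy(first, Bin128ToDec(a));              /* keep a copy: "4294967295" */
  printf("%s %s\n", first, Bin128ToDec(b));   /* 4294967295 4294967296 */
  return 0;
}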
answered by Alexey Frunze