I'm implementing the ElGamal signature scheme and I need the decimal (integer) value of the message hash to compute S during signature generation. An example of the hexadecimal hash is:
820dbb4256a4287557ade2f729d279f1
As you can see above, the hash value is a 32-digit hexadecimal number. I need to convert this string to a decimal integer and use it for calculation later.
string hash = md5(message);
cout << hash << endl;
NTL::ZZ msgHash = strtol(hash.c_str(), NULL, 16); // fails: strtol returns a long, which cannot hold a 128-bit value
cout << msgHash << endl;
There is no built-in integer type large enough to hold the value of a 32-digit (128-bit) hexadecimal hash, so I tried the big-integer type from the NTL library, but that didn't work out either: you cannot assign the long returned by strtol to the NTL::ZZ type (and the decimal hash value is far beyond the range of long anyway). Is there any good solution to this?
I'm doing this with Visual C++ in Visual Studio 2013.
The result of MD5 is an octet string (a byte array). The hexadecimal string is just a textual representation of that binary value. You need to decode your hexadecimal string back to bytes; this is performed by a hexadecimal decoder, found in oodles of libraries.
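If you don't want to pull in a library just for this step, a minimal decoder is easy to write yourself. The sketch below (the helper name `hexToBytes` is my own, not from any particular library) turns a string like the one above into raw bytes:

```cpp
#include <cstddef>
#include <stdexcept>
#include <string>
#include <vector>

// Decode a hex string such as "820dbb42..." into its raw bytes.
std::vector<unsigned char> hexToBytes(const std::string& hex) {
    if (hex.size() % 2 != 0)
        throw std::invalid_argument("hex string must have even length");
    auto nibble = [](char c) -> unsigned {
        if (c >= '0' && c <= '9') return c - '0';
        if (c >= 'a' && c <= 'f') return c - 'a' + 10;
        if (c >= 'A' && c <= 'F') return c - 'A' + 10;
        throw std::invalid_argument("invalid hex digit");
    };
    std::vector<unsigned char> out;
    out.reserve(hex.size() / 2);
    for (std::size_t i = 0; i < hex.size(); i += 2)
        out.push_back(static_cast<unsigned char>((nibble(hex[i]) << 4) | nibble(hex[i + 1])));
    return out;
}
```

A 32-digit MD5 hash decodes to exactly 16 bytes.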
Then you need to convert the byte array to a multi-precision integer, for which you need a library. Within crypto (which is often based on group theory) it usually makes sense to interpret the bytes as an unsigned, big-endian (most significant byte first) encoded integer.
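To make the "bytes as an unsigned big-endian integer" interpretation concrete, here is a self-contained sketch that computes the decimal representation using a plain digit vector instead of a real big-integer library; it just applies value = value * 256 + byte for each byte:

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Interpret bytes as an unsigned big-endian integer and return its
// decimal representation; digits are stored least-significant first.
std::string bytesToDecimal(const std::vector<unsigned char>& bytes) {
    std::vector<int> digits{0};
    for (unsigned char b : bytes) {        // MSB first: value = value*256 + b
        int carry = b;
        for (std::size_t i = 0; i < digits.size(); ++i) {
            int v = digits[i] * 256 + carry;
            digits[i] = v % 10;
            carry = v / 10;
        }
        while (carry) { digits.push_back(carry % 10); carry /= 10; }
    }
    std::string s;
    for (auto it = digits.rbegin(); it != digits.rend(); ++it)
        s += static_cast<char>('0' + *it);
    return s;
}
```

With NTL you would not hand-roll this: if I recall its API correctly, NTL::ZZFromBytes builds a ZZ from a byte array, but it expects the least significant byte first, so you would reverse the decoded bytes before calling it to get the big-endian interpretation.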
If you need to perform any additional cryptography-related calculations, it is a good idea to use a cryptographic library such as OpenSSL (C), or Crypto++ or Botan (C++). These libraries already contain types for handling large (unsigned) numbers, and generally hexadecimal codecs as well.