 

Simulating C# overflow in NodeJS

Tags: c#, .net, node.js

I am trying to translate C# code to Node.js and I've hit a wall. One of the functions in C# uses a byte array to generate 3 numbers using BitConverter.ToInt64, like so:

    var hashText = //Generates Hash from an input here using ComputeHash

    var hashCodeStart = BitConverter.ToInt64(hashText, 0);  
    var hashCodeMedium = BitConverter.ToInt64(hashText, 8);
    var hashCodeEnd = BitConverter.ToInt64(hashText, 24);


 //Does other stuff with the three pieces here

As an example, if I use the array:

var hash = new Byte[] {0xAA, 0x9B, 0x50, 0xA7, 0x56, 0x8D, 0x2A, 0x99, 0x87, 0xA7, 0x24, 0x10, 0xF8, 0x1E, 0xC3, 0xA2, 0xF9, 0x57, 0x1A, 0x2D, 0x69, 0x89, 0x83, 0x91, 0x2D, 0xFA, 0xA5, 0x4A, 0x4E, 0xA2, 0x81, 0x25};

Then the values for start, middle, and end are:

Start : -7409954833570948182

Middle: -6718492168335087737

End : 2702619708542548525

But using NodeJS with the biguint-format package I get the following numbers (code below):

start : 12293508287479753369

middle : 9774821171531793314

end : 17966858020764353425

with the following NodeJS code:

    var hexed = "aa9b50a7568d2a9987a72410f81ec3a2f9571a2d698983912dfaa54a4ea28125"
    var format = require('biguint-format')

    console.log(hexed.toUpperCase().slice(0, 16))
    console.log("Start is " + format(hexed.toUpperCase().slice(0, 16), 'dec'))

    console.log(hexed.toUpperCase().slice(16, 32))
    console.log("Middle is " + format(hexed.toUpperCase().slice(16, 32), 'dec'))

    console.log(hexed.toUpperCase().slice(32, 48))
    console.log("End is " + format(hexed.toUpperCase().slice(32, 48), 'dec'))

I understand that the C# numbers are coming out negative because of some overflow; however, the problem is that the overflow seems to happen before the maximum value an Int64 can store.

Is there any way for me to find out which number this is, or any other way for me to emulate the C# code?

asked Aug 05 '16 by kesubagu

1 Answer

You're using a string instead of a buffer, and splitting the string that way gives you 16 separate hex digits instead of 8 byte values. I don't know if that makes a difference or if it's simply being converted wrong; try using the documented approach:

var buffer1 = Buffer.from([0xAA, 0x9B, 0x50, 0xA7, 0x56, 0x8D, 0x2A, 0x99]); // the first 8 bytes of the hash
format(buffer1, 'dec', {format:'LE'})
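
Applied to the whole hash, that might look like the following sketch (untested; note that the C# code reads the third value at byte offset 24, so the last slice should cover bytes 24–32, not the 16–24 range that the question's slice(32, 48) selects):

    var format = require('biguint-format');

    var hash = Buffer.from([
        0xAA, 0x9B, 0x50, 0xA7, 0x56, 0x8D, 0x2A, 0x99,
        0x87, 0xA7, 0x24, 0x10, 0xF8, 0x1E, 0xC3, 0xA2,
        0xF9, 0x57, 0x1A, 0x2D, 0x69, 0x89, 0x83, 0x91,
        0x2D, 0xFA, 0xA5, 0x4A, 0x4E, 0xA2, 0x81, 0x25
    ]);

    // Unsigned decimal value of each 8-byte slice, read little-endian,
    // mirroring BitConverter.ToInt64(hashText, offset).
    var start  = format(hash.slice(0, 8),   'dec', {format: 'LE'});
    var middle = format(hash.slice(8, 16),  'dec', {format: 'LE'});
    var end    = format(hash.slice(24, 32), 'dec', {format: 'LE'});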

You will potentially still get an unsigned value, so you will need to convert it to a signed 64-bit integer afterwards.
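
One way to do that conversion, sketched here assuming a Node.js version with BigInt support (which postdates this answer), is BigInt.asIntN, which reinterprets the low 64 bits of a value as two's-complement signed:

    // Wrap an unsigned 64-bit value (as a decimal string) into the signed Int64 range.
    function toInt64(unsignedDecimal) {
        return BigInt.asIntN(64, BigInt(unsignedDecimal));
    }

    // The unsigned little-endian value of the first 8 bytes, wrapped to C#'s result:
    console.log(toInt64('11036789240138603434').toString()); // -7409954833570948182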

As @argaz mentions, BitConverter is usually little-endian, hence the need for the LE flag.
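
On Node.js 12 or later (again, newer than this answer) you can skip the package entirely: Buffer.readBigInt64LE reads a signed little-endian 64-bit value directly and reproduces the C# output:

    var hash = Buffer.from('aa9b50a7568d2a9987a72410f81ec3a2f9571a2d698983912dfaa54a4ea28125', 'hex');

    // Same offsets as the BitConverter.ToInt64 calls in the question.
    console.log(hash.readBigInt64LE(0).toString());  // -7409954833570948182
    console.log(hash.readBigInt64LE(8).toString());  // -6718492168335087737
    console.log(hash.readBigInt64LE(24).toString()); // 2702619708542548525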

answered Sep 30 '22 by johnny 5