 

JSON stringify and PostgreSQL bigint compliance

I am trying to add BigInt support within my library, and ran into an issue with JSON.stringify.

The nature of the library means I do not need to worry about type ambiguity or de-serialization: everything that's serialized goes to the server, and never needs any de-serialization.

I initially came up with the following simplified approach, just to stop Node.js from throwing TypeError: Do not know how to serialize a BigInt at me:

// Does JSON.stringify, with support for BigInt:
function toJson(data) {
    return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? v.toString() : v);
}

But since it converts each BigInt into a string, each value ends up wrapped in double quotes.

Is there any work-around, perhaps some trick within Node.js formatting utilities, to produce a result from JSON.stringify where each BigInt would be formatted as an open (unquoted) value? That is what PostgreSQL understands and supports, so I'm looking for a way to generate JSON with BigInt that's compliant with PostgreSQL.

Example

const obj = {
    value: 123n
};

console.log(toJson(obj));

// This is what I'm getting: {"value":"123"}
// This is what I want: {"value":123}

Obviously, I cannot just convert BigInt into number, as I would be losing information then. And rewriting the entire JSON.stringify for this probably would be too complicated.
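To illustrate the information loss, here's a quick check of what happens when a BigInt just above Number.MAX_SAFE_INTEGER (2^53 - 1) is coerced into a number:

```javascript
// Coercing a BigInt above Number.MAX_SAFE_INTEGER rounds it
// to the nearest representable double, losing information:
const big = 9007199254740993n;         // 2^53 + 1
const asNumber = Number(big);          // rounds to 2^53

console.log(asNumber);                 // 9007199254740992
console.log(BigInt(asNumber) === big); // false - the value changed
```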

UPDATE

At this point I have reviewed and played with several polyfills, such as these:

  • polyfill-1
  • polyfill-2

But they all seem like an awkward solution: bringing in so much code, only to then modify it for BigInt support. I am hoping to find something more elegant.

vitaly-t asked Oct 05 '19

1 Answer

Solution that I ended up with...

Inject the BigInt-s as 123n-style values, and then un-quote them with the help of RegEx:

function toJson(data) {
    return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? `${v}n` : v)
        .replace(/"(-?\d+)n"/g, (_, a) => a);
}

It does exactly what's needed, and it is fast. The only downside is that if your data contains a string that itself matches the 123n pattern, it will also become an open number. But you can easily obfuscate the marker above, into something like ${^123^}, or 123-bigint; the algorithm accommodates that easily.
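Here's that collision in action, using the toJson from above:

```javascript
function toJson(data) {
    return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? `${v}n` : v)
        .replace(/"(-?\d+)n"/g, (_, a) => a);
}

// BigInt serializes as an open number, as intended:
console.log(toJson({value: 123n}));   // {"value":123}

// But a plain string matching the pattern gets un-quoted too:
console.log(toJson({value: '456n'})); // {"value":456}
```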

As per the question, the operation is not meant to be reversible, so if you use JSON.parse on the result, those values become regular number-s, losing precision for anything beyond 2^53, as expected.
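A quick demonstration of that one-way nature, again using the toJson from above:

```javascript
function toJson(data) {
    return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? `${v}n` : v)
        .replace(/"(-?\d+)n"/g, (_, a) => a);
}

const json = toJson({big: 12345678901234567890n}); // well above 2^53
console.log(json); // {"big":12345678901234567890}

// Parsing it back yields a regular number, rounded to the nearest double:
const parsed = JSON.parse(json);
console.log(typeof parsed.big);                            // "number"
console.log(BigInt(parsed.big) === 12345678901234567890n); // false
```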

Whoever said it was impossible - huh? :)

UPDATE-1

For compatibility with JSON.stringify, undefined must result in undefined. And within the actual pg-promise implementation I am now using "123#bigint" pattern, to make an accidental match way less likely.

And so here's the final code from there:

function toJson(data) {
    if (data !== undefined) {
        return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? `${v}#bigint` : v)
            .replace(/"(-?\d+)#bigint"/g, (_, a) => a);
    }
}

UPDATE-2

Going through the comments below, you can make it safe, by counting the number of replacements to match that of BigInt injections, and throwing error when there is a mismatch:

function toJson(data) {
    if (data !== undefined) {
        let intCount = 0, repCount = 0;
        const json = JSON.stringify(data, (_, v) => {
            if (typeof v === 'bigint') {
                intCount++;
                return `${v}#bigint`;
            }
            return v;
        });
        const res = json.replace(/"(-?\d+)#bigint"/g, (_, a) => {
            repCount++;
            return a;
        });
        if (repCount > intCount) {
            // You have a string somewhere that looks like "123#bigint";
            throw new Error(`BigInt serialization conflict with a string value.`);
        }
        return res;
    }
}

though I personally think it is overkill, and the approach within UPDATE-1 is quite good enough.
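For completeness, a quick usage sketch of the safeguarded version above (the function is copied here so the example is self-contained):

```javascript
function toJson(data) {
    if (data !== undefined) {
        let intCount = 0, repCount = 0;
        const json = JSON.stringify(data, (_, v) => {
            if (typeof v === 'bigint') {
                intCount++;
                return `${v}#bigint`;
            }
            return v;
        });
        const res = json.replace(/"(-?\d+)#bigint"/g, (_, a) => {
            repCount++;
            return a;
        });
        if (repCount > intCount) {
            throw new Error(`BigInt serialization conflict with a string value.`);
        }
        return res;
    }
}

console.log(toJson({value: 123n})); // {"value":123}

// A colliding string now throws, instead of silently becoming a number:
try {
    toJson({value: '456#bigint'});
} catch (e) {
    console.log(e.message); // BigInt serialization conflict with a string value.
}
```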

vitaly-t answered Nov 15 '22