Why is JSON faster than BSON in node.js?

After reading this, one quote stood out to me:

BSON is also designed to be fast to encode and decode. For example, integers are stored as 32 (or 64) bit integers, so they don't need to be parsed to and from text. This uses more space than JSON for small integers, but is much faster to parse.
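To make that concrete, here is a quick sketch (using the same bson package as my benchmark below; the exact byte counts depend on the library version and on the number being stored as an int32) of how a small integer takes more bytes in BSON but needs no text parsing:

var bson = require('bson');
var BSON = new bson.BSONPure.BSON();

var doc = { n: 5 };

// JSON stores the value as text: '{"n":5}' is 7 characters that must be parsed back
console.log(JSON.stringify(doc).length);  // 7

// BSON stores the value as a fixed-width little-endian int32 inside the document
// framing, so the whole document is larger (12 bytes here, assuming the library
// encodes it as an int32), but reading the number back is a direct 4-byte read
console.log(BSON.serialize(doc).length);  // 12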

From what I am reading, the entire point of using BSON is that it is less taxing on the CPU and faster to encode/decode.

But I did some tests with Node.js, and the native JSON approach blows BSON out of the water. Some tests show JSON to be around 3 to 5 times faster (and around 6 to 8 times faster when using more data types).

Benchmark Code:

var bson = require('bson');
var BSON = new bson.BSONPure.BSON(); // pure JavaScript BSON implementation (bson 0.x API)

var os = require('os');

console.log(" OS: " + os.type() + " " + os.release() + " (" + os.arch() + ")");
console.log("RAM: " + os.totalmem() / 1048576 + " MB (total), " + os.freemem() / 1048576 + " MB (free)");
console.log("CPU: " + os.cpus()[0].speed + " MHz " + os.cpus()[0].model);

for (var r = 1; r < 4; r++) {
    console.log("\nRun #" + r + ":");
    var obj = {
        'abcdef': 1,
        'qqq': 13,
        '19': [1, 2, 3, 4]
    };

    // JSON round trip: stringify then parse, 500,000 times
    var start = Date.now();
    for (var i = 0; i < 500000; i++) {
        JSON.parse(JSON.stringify(obj));
    }
    var stop = Date.now();
    console.log("\t      JSON: " + (stop - start) + " ms");

    // BSON round trip: serialize then deserialize, 500,000 times
    start = Date.now();
    for (var i = 0; i < 500000; i++) {
        BSON.deserialize(BSON.serialize(obj));
    }
    stop = Date.now();
    console.log("\t      Bson: " + (stop - start) + " ms");
}

Results:

OS: Windows_NT 6.1.7601 (x64)
RAM: 8174.1171875 MB (total), 5105.03515625 MB (free)
CPU: 3515 MHz AMD FX(tm)-6300 Six-Core Processor

Run #1:
              JSON: 1820 ms
              Bson: 8639 ms

Run #2:
              JSON: 1890 ms
              Bson: 8627 ms

Run #3:
              JSON: 1882 ms
              Bson: 8692 ms

With that said, I am looking for a binary approach to send and receive data through websockets, and BSON fits that requirement perfectly. But looking at the benchmark results above, how can BSON be less taxing on the CPU when it takes longer to serialize/deserialize objects?

Does BSON make up for the extra CPU usage because no conversion to UTF-8 is needed, as there would be with text-based websockets? Would that even out the performance?
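For reference, here is a rough sketch of what I mean by binary versus text frames over a websocket; it uses the ws package purely as an example (the package choice and the echo-style setup are assumptions, not my actual code):

var WebSocket = require('ws');
var bson = require('bson');
var BSON = new bson.BSONPure.BSON();

var wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', function (socket) {
    socket.on('message', function (data) {
        // older ws versions deliver text frames as strings and binary frames as Buffers
        if (Buffer.isBuffer(data)) {
            // binary frame: raw bytes, decoded with BSON, no UTF-8 text layer
            console.log('BSON message:', BSON.deserialize(data));
        } else {
            // text frame: arrives as a UTF-8 encoded string and must be parsed
            console.log('JSON message:', JSON.parse(data));
        }
    });
});

var client = new WebSocket('ws://localhost:8080');
client.on('open', function () {
    var obj = { 'abcdef': 1, 'qqq': 13, '19': [1, 2, 3, 4] };
    client.send(JSON.stringify(obj));   // sent as a text frame
    client.send(BSON.serialize(obj));   // a Buffer is sent as a binary frame
});

So the BSON payload goes over the wire as raw bytes, while the JSON payload goes through a string/UTF-8 step on both ends; that is what my question about evening out the CPU cost is getting at.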

@Joe Clay below, here are the results for stringify and serialize only:

Run #1:
              JSON: 922 ms
              Bson: 3555 ms
asked Apr 21 '16 by NiCk Newman



1 Answer

The question should not be "Why is JSON faster than BSON?" but rather "Why is JSON faster than BSON in node.js?".

In most environments, binary encodings like BSON, MessagePack or CBOR would be easier to encode and decode than the textual JSON encoding. However, JavaScript environments (like V8/node.js) are heavily optimized for JSON handling (because it's a subset of JavaScript). JSON encoding and decoding is most likely implemented there as optimized native code directly in the JS VM. The JavaScript VMs are, however, not nearly as optimized for representing and manipulating byte arrays, which is what a BSON library has to work with. Node's native Buffer type is probably better than a pure JS array, but working with it (and, for example, doing the JS string (UTF-16) -> UTF-8 byte conversion in JS) is still slower than the built-in JSON serialization.
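A rough way to see where that cost ends up (a sketch only, reusing the bson package and the object from the question) is to time each half of the round trip separately instead of the combined serialize + deserialize loop:

var bson = require('bson');
var BSON = new bson.BSONPure.BSON();

var obj = { 'abcdef': 1, 'qqq': 13, '19': [1, 2, 3, 4] };
var iterations = 500000;

function time(label, fn) {
    var start = Date.now();
    for (var i = 0; i < iterations; i++) fn();
    console.log(label + ": " + (Date.now() - start) + " ms");
}

// pre-build the inputs so each loop measures only one direction
var jsonStr = JSON.stringify(obj);
var bsonBuf = BSON.serialize(obj);

time("JSON.stringify  ", function () { JSON.stringify(obj); });
time("JSON.parse      ", function () { JSON.parse(jsonStr); });
time("BSON.serialize  ", function () { BSON.serialize(obj); });
time("BSON.deserialize", function () { BSON.deserialize(bsonBuf); });

Whatever the exact numbers on a given machine, the JSON halves run in the VM's native fast path, while both BSON halves spend their time in JavaScript-level Buffer and string handling.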

In other languages like C++, with direct byte array access and UTF-8 string types, the results might be completely different.

answered Oct 14 '22 by Matthias247