 

How to filter JSON data in node.js? [duplicate]

I have seen very old answers to this question, and many of the technologies used two years back have changed.

What I have is JSON data sent from a database to my server, and what I would like to know is how to filter that data.

I am running a server with node.js, and what I would like to do is something like:

var results = QueryLibrary.load(jsondata);
var filtered = results.query('select where user = "user1"');

How can I do something like this in JavaScript running in Node?

asked Aug 26 '14 by Bryan Arbelo - MaG3Stican


2 Answers

You can use any of the normal array/object built-in functions that JavaScript has; normally that kind of query would be made at the time of retrieving your data from the database, not after.

Something like:

for (var i = 0; i < objIdsArray.length; i++) {
    for (var j = 0; j < mockJSON.length; j++) {
        if (mockJSON[j]["id"] === parseInt(objIdsArray[i], 10)) {
            mockJSON.splice(j, 1); // to delete it, could be any other instruction
            j--; // step back so the element that shifted into this slot isn't skipped
        }
    }
}
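
If your data is already a flat array of objects, Array.prototype.filter is the built-in that maps most directly to a "select where" style query. A minimal sketch, assuming the parsed JSON is an array of user records (the data below is only illustrative):

// illustrative data; in practice this would be your parsed JSON
var users = [
    {user: "user1", age: 20},
    {user: "user2", age: 30}
];

// keep only the rows whose user field matches
var filtered = users.filter(function (row) {
    return row.user === "user1";
});

console.log(filtered); // => [ { user: 'user1', age: 20 } ]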
answered Sep 20 '22 by Santiago Rebella


underscore has a where function that does just this:

var _ = require("underscore");

var json = '[{"user": "a", "age": 20}, {"user": "b", "age": 30}, {"user": "c", "age": 40}]';

var users = JSON.parse(json);

var filtered = _.where(users, {user: "a"});

// => [{user: "a", age: 20}]

Another utility library, Lo-Dash, has a where function that operates identically.
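
For comparison, a small sketch reusing the users array from the underscore example above (note that in lodash 4 and later _.where was removed; _.filter with an object predicate is the equivalent):

var _ = require("lodash");

var filtered = _.where(users, {user: "a"});      // lodash 3.x and earlier
// var filtered = _.filter(users, {user: "a"});  // lodash 4+ equivalent

// => [{user: "a", age: 20}]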


You can add underscore to your project using

$ npm install --save underscore

or lodash

$ npm install --save lodash

If you only care about the where function, lodash offers it as a separate module

// only install lodash.where
$ npm install --save lodash.where

To use it in your project

var where = require("lodash.where");

// ...
var filtered = where(users, {"user": "a"});

Even if you use a library to do this, a better approach is probably to set up a chain of streams that handles all of your data processing in smaller modules.

Without knowing what you actually want to do, I've created this as an example. For the purposes of this code, maybe think of a debug logging stream or something.

json-parser.js

input: string (JSON)
output: object

var Transform = require("stream").Transform;

function JsonParser() {
  Transform.call(this, {objectMode: true});
  this._transform = function _transform(json, enc, done) {
    try {
      this.push(JSON.parse(json));
    }
    catch (e) {
      return done(e);
    }
    done();
  }
}

JsonParser.prototype = Object.create(Transform.prototype, {
  constructor: {
    value: JsonParser
  }
});

module.exports = JsonParser;

obj-filter.js

input: object
output: object (result of where(data, filters))

var Transform = require("stream").Transform;
var where = require("lodash.where");

function ObjFilter(filters) {
  Transform.call(this, {objectMode: true});
  this._transform = function _transform(obj, enc, done) {
    this.push(where(obj, filters));
    done();
  }
}

ObjFilter.prototype = Object.create(Transform.prototype, {
  constructor: {
    value: ObjFilter
  }
});

module.exports = ObjFilter;

stringifier.js

input: object
output: string (JSON)

var Transform = require("stream").Transform;

function Stringifier() {
  Transform.call(this, {objectMode: true});
  this._transform = function _transform(obj, enc, done) {
    this.push(JSON.stringify(obj));
    done();
  }
}

Stringifier.prototype = Object.create(Transform.prototype, {
  constructor: {
    value: Stringifier
  }
});

module.exports = Stringifier;

app.js

// modules
var JsonParser = require("./json-parser");
var ObjFilter = require("./obj-filter");
var Stringifier = require("./stringifier");

// run
var parser = new JsonParser();

// setup stream chain
parser.pipe(new ObjFilter({"user": "a"}))
      .pipe(new Stringifier())
      .pipe(process.stdout);

// send example json in
parser.write('[{"user": "a", "age": 20}, {"user": "b", "age": 30}, {"user": "c", "age": 40}]');

// output
// => [{"user":"a","age":20}]

Here, I made a Stringifier stream that converts objects back into JSON so that we can see them dumped to the console, though you could easily create any streams you need to handle the operations your app requires. Your stream endpoints will likely not be writing to the console.

As a last note, you would probably create a database stream that accepts some sort of query options and emits JSON. You would pipe that stream directly into the parser.
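
As a rough sketch of that idea, following the same prototype style as the streams above (fetchRows and its query options are hypothetical stand-ins for whatever your database client actually provides):

var Readable = require("stream").Readable;

// fetchRows(options, callback) is a hypothetical async database call
function DbSource(options, fetchRows) {
  Readable.call(this, {objectMode: true});
  var started = false;
  this._read = function _read() {
    if (started) return;
    started = true;
    var self = this;
    fetchRows(options, function (err, rows) {
      if (err) return self.emit("error", err);
      self.push(JSON.stringify(rows)); // emit JSON, the shape JsonParser expects
      self.push(null);                 // signal end of stream
    });
  };
}

DbSource.prototype = Object.create(Readable.prototype, {
  constructor: {
    value: DbSource
  }
});

// usage: new DbSource({user: "a"}, fetchRows).pipe(parser);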

Anyway, I hope this gives you a better idea of how to process data in node.js.

answered Sep 20 '22 by Mulan