I have a Node.js Express server which breaks chunks of JSON down into smaller sections and routes requests via URLs for mobile devices and various other apps I am creating.
I have just moved from test data to live data, but I am finding that Node.js is not using the latest version of the JSON; instead it is caching and reusing the JSON that was in place when the server started.
Here is (part of) the code:
var express = require('express'),
    http = require('http'),
    forever = require('forever');
var ppm = require('./data/ppm.json');
var stations = require('./data/stations.json');
var fgwstations = require('./data/fgwstations.json');
var app = express()
.use(express.bodyParser())
.use(express.static('public'));
app.all('/', function(req, res, next) {
res.header("Access-Control-Allow-Origin", "*");
res.header("Access-Control-Allow-Headers", "X-Requested-With");
next();
});
app.get('/ppm/all', function (req, res) {
res.json(ppm);
});
app.get('/*', function (req, res) {
res.json(404, {status: 'datafeed not found please refer to documentation'});
});
http.createServer(app).listen(3000, function () {
console.log("Data Server ready at http://localhost:3000");
});
There may be typos in the code, as I have just ripped chunks out; the full article is quite long, as the server does a LOT. I keep this server running permanently using forever, so it runs without a shell session staying open.
Now I suspect I could use some kind of fs.watch-type function to restart the server every time the JSON file is updated, but restarting a server seems a bit iffy, especially when this data will be updated every 2-3 minutes and future feeds even more rapidly. It only takes one person or app requesting data during a restart to cause a problem.
Is there a way of 're-reading' the JSON file assigned to var ppm (the others are fairly static), or is restarting the server the only way?
Any good ideas on how to read that file would be greatly appreciated, as I am sure someone has a much more efficient way of doing it.
The current dev server is open source (and very much WIP); feel free to see what it does:
http://54.194.148.89:3000
require() caches everything it loads, as it is designed for loading code modules, which shouldn't change at runtime. If you want to reload something, you have to delete its entry from require.cache.
A better approach is to use a combination of fs.watch, fs.readFile and JSON.parse to reload the changing data. There is no need to fiddle with the require cache or restart the server.
An even better approach would probably be to use a database of some sort instead of the filesystem.