I'm new to Node and trying to ensure that I'm using sane designs for a JSON-driven web app.
I've got a bunch of data stored in Redis and am retrieving it through node, streaming out the results as they come from Redis. Here's a good example of what I'm doing:
app.get("/facility", function(req, res) {
rc.keys("FACILITY*", function(err, replies) {
res.write("[");
replies.forEach(function (reply, i) {
rc.get(reply, function(err, reply) {
res.write(reply);
if (i == replies.length-1) {
res.write("]");
res.end();
}
else
res.write(",");
});
});
});
});
Essentially I'm getting a set of keys from Redis and then requesting each one, streaming out the result into semi-manually created JSON (the strings coming out of Redis are already JSON). Now this works nicely, but I can't help thinking that the i == replies.length-1 check is a little untidy.
I could do all this with mget in Redis, but that isn't really the point I'm trying to get at; it's how best to handle async looping with forEach, streaming the output, and gracefully closing off the connection with res.end when the looping is done.
Is this the best way, or is there a more elegant pattern I could follow?
Which async patterns should we use? For simple projects, callbacks are the simplest and easiest way to handle async flows. On bigger projects with a proper setup, I would choose the async/await pattern: the asynchronicity is easy to read, error handling comes naturally with try/catch, and there's no pyramid of doom.
The three patterns discussed here are callbacks, promises, and async/await. There are other patterns as well as multiple variations of each.
Node.js favors asynchronous APIs because it is single-threaded. This lets it manage its own resources efficiently, but it requires that long-running operations be non-blocking, and asynchronous APIs are a way to keep control of the flow across many non-blocking operations.
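For comparison, here is a minimal async/await sketch of the route from the question. It assumes a promise-based Redis client (for example node-redis v4+, where rc.keys() and rc.get() return promises) rather than the callback-style client used above:

app.get("/facility", async function(req, res) {
    try {
        const keys = await rc.keys("FACILITY*");
        res.write("[");
        for (let i = 0; i < keys.length; i++) {
            // Awaiting each get keeps the output in key order.
            res.write(await rc.get(keys[i]));
            if (i < keys.length - 1) res.write(",");
        }
        res.end("]");
    } catch (err) {
        // Express 4 won't route errors from async handlers to middleware,
        // and once streaming has started all we can do is end the response.
        res.end();
    }
});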
The above code might not do what you expect. You're kicking off each .get() in sequence, but they might not call back in sequence, so the results could stream out in any order. If you want to stream the results instead of collecting them in memory, you need to .get() in sequence.
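To see the race in miniature (hypothetical keys, for illustration only):

["a", "b", "c"].forEach(function (key) {
    rc.get(key, function (err, value) {
        // Callbacks fire as Redis replies arrive, which may not match
        // the order in which the gets were issued.
        console.log(key, value);
    });
});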
I think that caolan’s async library makes a lot of this easier. Here’s one way you could use it to get each item in sequence (warning, untested):
app.get("/facility", function(req, res) {
rc.keys("FACILITY*", function(err, replies) {
var i = 0;
res.write("[");
async.forEachSeries(replies, function(reply, callback){
rc.get(reply, function(err, reply) {
if (err){
callback(err);
return;
}
res.write(reply);
if (i < replies.length) {
res.write(",");
}
i++;
callback();
});
}, function(err){
if (err) {
// Handle an error
} else {
res.end(']');
}
});
});
});
If you don't care about the order, just use async.forEach() instead.
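Here's a minimal sketch of that parallel variant (untested, like the others). Since you can't know which reply will arrive last, a first flag replaces the index check and the comma is written before each item instead of after:

app.get("/facility", function(req, res) {
    rc.keys("FACILITY*", function(err, replies) {
        var first = true;
        res.write("[");
        async.forEach(replies, function(reply, callback) {
            rc.get(reply, function(err, value) {
                if (err) {
                    callback(err);
                    return;
                }
                // Node runs these callbacks one at a time, so the writes
                // never interleave; prefixing the comma yields valid JSON
                // whatever order the replies arrive in.
                res.write((first ? "" : ",") + value);
                first = false;
                callback();
            });
        }, function(err) {
            if (err) {
                // Handle an error
            } else {
                res.end("]");
            }
        });
    });
});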
If you wouldn't mind collecting the results and want them to come back in order, you could use async.map() like this (warning, also untested):
app.get("/facility", function(req, res) {
rc.keys("FACILITY*", function(err, replies) {
async.map(replies, rc.get.bind(rc), function(err, replies){
if (err) {
// Handle an error
} else {
res.end('[' + replies.join(',') + ']');
}
});
});
});
You can use the async library; it provides some handy methods for looping, such as forEach:
forEach(arr, iterator, callback)
Applies an iterator function to each item in an array, in parallel. The iterator is called with an item from the list and a callback for when it has finished. If the iterator passes an error to this callback, the main callback for the forEach function is immediately called with the error.
Note that since this function applies the iterator to each item in parallel, there is no guarantee that the iterator functions will complete in order.
Example
// assuming openFiles is an array of file names and saveFile is a function
// to save the modified contents of that file:
async.forEach(openFiles, saveFile, function(err) {
    // if any of the saves produced an error, err would equal that error
});
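For that to work, the iterator has to accept the item plus a Node-style callback. A hypothetical saveFile might look like this:

var fs = require("fs");

// Hypothetical iterator: async.forEach calls it with each item and a
// callback; any error passed to that callback is forwarded straight to
// async.forEach's final callback.
function saveFile(file, callback) {
    fs.writeFile(file, "modified contents", callback);
}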