I found this issue while debugging: the following code did not work:
var req = http.request(options, function (res) {
  // res.on('error', cb(err));
  res.on('end', function () {
    cb();
  });
});
However, the following did work:
var req = http.request(options, function (res) {
  // res.on('error', cb(err));
  res.on('data', function (chunk) {
    // why do we need this?
  });
  res.on('end', function () {
    cb();
  });
});
The res variable is a Readable Stream. The Node.js documentation for the 'end' event includes the following note:

"Note that the 'end' event will not fire unless the data is completely consumed."

By adding the 'data' event handler, you consume the data.
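You can see the same behavior with a plain Readable stream, independent of HTTP. A minimal sketch (the stream contents here are just an illustration):

var stream = require('stream');

var r = new stream.Readable();
r._read = function () {}; // no-op; data is pushed manually below
r.push('hello');
r.push(null); // signal end-of-stream

r.on('end', function () {
  console.log('end fired');
});

// At this point nothing is printed: the stream is paused and its
// buffered data has not been consumed.
r.resume(); // start consuming (and discarding) the data; 'end' now fires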
It's how node's backpressure mechanism works. If the response stream's buffer fills up, it tells the server to stop sending data (this is handled at the TCP layer). Once you start reading data (via res.read(), by attaching a 'data' handler, or simply by calling res.resume()), more data is transferred from the server until there is none left. Only once the server has no more data to send will you get an 'end' event. I typically use res.resume(); since it's a lot shorter.
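Applied to the question's code, that looks something like this sketch (options and cb are assumed to be defined as in the question; the req.on('error', ...) handler and req.end() were not shown in the original snippet but are needed for a complete request):

var http = require('http');

var req = http.request(options, function (res) {
  res.resume(); // consume and discard the response body so 'end' can fire
  res.on('end', function () {
    cb();
  });
});
req.on('error', cb); // without this, a request error would throw
req.end();           // actually send the request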
This behavior has existed since node v0.10. Before that, you could actually lose data if you did not attach a 'data' handler right away, which, as you can imagine, caused problems for a lot of people. So with node v0.10+ the default behavior is to pause until you start reading (this happens at the node streams layer, separate from the network).