I am getting acquainted with streams in nodejs and I have a question:
I have what I think is the simplest possible nodejs "echo server", i.e. a server that simply pipes back to the response stream whatever it receives via the request stream. It works, but with a caveat: the client only receives the data back after it closes the submitting stream. Here is the server code:
var http = require('http');
var server = http.createServer(function (req, res) {
  req.pipe(res); // echo the request body straight back to the response
});
server.listen(8000);
And here is how I test it:
Doing the following works just fine:
term1> node server.js&
term2> echo 'hello world!'|curl --no-buffer --data-binary @- 'http://localhost:8000'
hello world!
However, it only works because echo closes its output file descriptor when it is done, i.e. the server will not write anything back until its client has finished sending:
term2>
term2> yes|curl --no-buffer --data-binary @- 'http://localhost:8000'
(here the command gets stuck forever)
I would expect that yes will fill the stream buffers pretty quickly, so I would start seeing y's coming back almost immediately. Unfortunately, they never do.
Is this expected? How should I use the streams/pipes so that they have the desired effect? By the way, I don't care whether the output comes back in chunks... I understand that that would be the result of the streams (or the underlying file I/O) doing their buffering magic.
Thank you for your help
Ok, I think I figured out what I was doing wrong. The server.js was working fine... it was curl that was breaking the pipeline. Apparently, the @- option treats stdin like a file, i.e. it reads the full content and only then submits it. I couldn't find a way to make curl pipe the contents as they are being read; I think it's not feasible, because curl wants to treat the contents of the file as a normal HTTP POST form value.
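As a side note, one way to see where the buffering happens is to log every chunk the server receives (a minimal sketch, not part of my original test runs; port 8001 is just an arbitrary choice so it can run alongside server.js). A streaming client makes these log lines appear while the request is still open, whereas a client that buffers, like curl with @-, produces them only once it has read all of its input:

var http = require('http');
var server = http.createServer(function (req, res) {
  // Log each chunk as soon as it arrives from the client.
  req.on('data', function (chunk) {
    console.log('received ' + chunk.length + ' bytes');
  });
  req.pipe(res); // still echo everything back
});
server.listen(8001); // arbitrary port, so it can run next to server.js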
Anyway, I modified the example above using a simple nodejs client that does what I wanted:
term2> cat client.js
var http = require('http');
// Stream the response body to stdout as it arrives.
var req = http.request('http://localhost:8000', function (res) {
  res.pipe(process.stdout);
});
// Stream stdin into the request body as it is read.
process.stdin.pipe(req);
term2> yes|node client.js
y
y
...
nodejs streams and pipes rock!!