
Event loop for large files?

If I'm not mistaken, I remember the "event loop" model of asynchronous I/O (Node.js, Nginx) not being well-suited to serving large files.

Is this the case, and if so, are there methods around it? I'm considering writing a real-time file explorer / file server in Node, but files could be anywhere from 100MB to ~3GB. I would assume that the event loop would block until the file is completely served?

asked Aug 03 '11 by Dan Simmons


People also ask

Why is event loop faster?

In some cases an event loop gives better performance than thread-based concurrency because of its non-blocking behavior. Networking applications are a good example: they use a single network connection resource and process data only when it becomes available, via thread-safe event loops.
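
A minimal sketch of that non-blocking behavior in Node.js (the file name is just a placeholder):

var fs = require('fs');

// Non-blocking read: the event loop is free to handle other work
// while the operating system reads the file.
fs.readFile('file.dat', function (err, data) {
    if (err) throw err;
    console.log('read ' + data.length + ' bytes');
});

console.log('this line runs before the file has been read');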

CAN NodeJS handle large data?

The Node.js stream feature makes it possible to process large data continuously in smaller chunks without keeping it all in memory. In other words, you can use streams to read from or write to a source continuously instead of processing all of it at once.
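
A small sketch of reading a large file in chunks with a stream (the file name is assumed):

var fs = require('fs');

// Read the file chunk by chunk instead of loading it all into memory.
var stream = fs.createReadStream('file.dat');

stream.on('data', function (chunk) {
    console.log('got ' + chunk.length + ' bytes');
});

stream.on('end', function () {
    console.log('done');
});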

Is event loop part of V8 or browser?

Since “the event loop” is nothing but a programming pattern, V8 allows an external event loop implementation to be plugged into its JavaScript runtime. Using this flexibility, the Chrome browser uses libevent as its event loop implementation, and Node.js uses libuv to implement the event loop.

How do you loop an event?

The Event Loop has one simple job — to monitor the Call Stack and the Callback Queue. If the Call Stack is empty, the Event Loop will take the first event from the queue and will push it to the Call Stack, which effectively runs it. Such an iteration is called a tick in the Event Loop.
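
A tiny sketch of that ordering in Node.js; the script prints first, second, third:

console.log('first');              // runs immediately on the call stack

setTimeout(function () {
    console.log('third');          // queued callback; the event loop pushes it
}, 0);                             // onto the stack only once the stack is empty

console.log('second');             // still part of the current stack, so it runs
                                   // before the queued callback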


2 Answers

No, it will not be blocked. node.js will read a file in chunks and then send those chunks to the client. In between chunks it will service other requests.

Reading files & sending data over the network are I/O bound operations. node.js will first ask the operating system to read a part of a file and while the OS is doing that node.js will service another request. When the OS gets back to node.js with the data, node.js will then tell the OS to send that data to the client. While the data is being sent, node.js will service another request.

Try it for yourself:

Create a large file

dd if=/dev/zero of=file.dat bs=1G count=1

Run this node.js app

var http = require('http');
var fs = require('fs');

var i = 1;
http.createServer(function (request, response) {
    console.log('starting #' + i++);
    // Stream the file to the client in 64 KB chunks instead of reading it
    // all into memory (newer Node versions call this option highWaterMark).
    var stream = fs.createReadStream('file.dat', { bufferSize: 64 * 1024 });
    stream.pipe(response);
}).listen(8000);

console.log('Server running at http://127.0.0.1:8000/');

Request http://127.0.0.1:8000/ several times and watch node.js handle them all.
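
One way to fire several concurrent requests, as a rough sketch of a throwaway client:

var http = require('http');

// Fire five requests at once; the server interleaves the responses.
for (var n = 0; n < 5; n++) {
    http.get({ host: '127.0.0.1', port: 8000, path: '/' }, function (res) {
        res.on('data', function () {});      // drain the response
        res.on('end', function () {
            console.log('finished one response');
        });
    });
}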

If you're going to serve lots of large files, you may want to experiment with different bufferSize values.
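
For example, a larger read buffer might look like this (in current Node versions the equivalent option is highWaterMark; the 256 KB value is just an illustration):

var stream = fs.createReadStream('file.dat', { highWaterMark: 256 * 1024 });  // 256 KB chunks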

answered Oct 19 '22 by mak


If I'm not mistaken, I remember the "event loop" model of asynchronous I/O (Node.js, Nginx) not being well-suited to serving large files.

I think you are correct that node.js is not optimized for serving big files. I advise you to have a look at Ryan Dahl's slides. Especially

Slide 14

Wow. Node sucks at serving large files. Well over 3 second responses for 256 kilobyte files at 300 concurrent connections.

Slide 15

What’s happening: V8 has a generational garbage collector. Moves objects around randomly. Node can’t get a pointer to raw string data to write to socket.

Slide 21

But the fact remains, pushing large strings to socket is slow.

Hopefully this can be mitigated in the future.

are interesting. Maybe this has changed, but I think it would probably be better to use Nginx (or maybe a CDN) to serve your static files. I think you are misinformed that Nginx is bad at serving large files. Node.js is (or was) bad at this because of V8 garbage collection, not because of the event loop. Also, this link might be interesting.

answered Oct 19 '22 by Alfred