Node.js EventEmitter events not sharing event loop

Perhaps the underlying issue is how the node-kafka module I am using has implemented things, but perhaps not, so here we go...

Using the node-kafka library, I am facing an issue with subscribing to consumer.on('message') events. The library uses the standard events module, so I think this question might be generic enough.

My actual code structure is large and complicated, so here is a pseudo-example of the basic layout to highlight my problem. (Note: this snippet is untested, so there may be errors, but the syntax is not in question here anyway.)

var messageCount = 0;
var queryCount = 0;

// Getting messages via some event emitter
consumer.on('message', function(message) {
    messageCount++;
    console.log('Message #' + messageCount);

    // Making a database call for each message
    mysql.query('SELECT "test" AS testQuery', function(err, rows, fields) {
        queryCount++;
        console.log('Query   #' + queryCount);
    });
});

What I am seeing is that when I start my server, there are 100,000 or so backlogged messages that Kafka wants to deliver, and it does so through the event emitter. So I start to get messages; getting and logging all of them takes about 15 seconds.

This is what I would expect to see as output, assuming the mysql query is reasonably fast:

Message #1
Message #2
Message #3
...
Message #500
Query   #1
Message #501
Message #502
Query   #2
... and so on in some intermingled fashion

I would expect this because my first mysql result should be ready very quickly, and the result(s) should take their turn in the event loop to have their responses processed. What I am actually getting is:

Message #1
Message #2
...
Message #100000
Query   #1
Query   #2
...
Query   #100000

I am getting every single message before a mysql response is able to be processed. So my question is, why? Why am I not able to get a single database result until all the message events are complete?
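For reference, here is a small standalone experiment (separate from my real code, with setImmediate standing in for the async mysql callback) that produces the interleaving I would expect when each emit takes its own turn on the event loop:

var EventEmitter = require('events').EventEmitter;
var emitter = new EventEmitter();

emitter.on('message', function(n) {
    console.log('Message #' + n);
    setImmediate(function() {         // stands in for the async mysql callback
        console.log('Query   #' + n);
    });
});

var i = 0;
(function emitNext() {
    if (++i > 3) return;
    emitter.emit('message', i);
    setImmediate(emitNext);           // yield to the event loop between emits
})();
// Prints: Message #1, Query #1, Message #2, Query #2, Message #3, Query #3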

Another note: I set a breakpoint at .emit('message') in node-kafka and at mysql.query() in my code, and I am hitting them in turn. So it appears that all 100,000 emits are not stacking up up-front before reaching my event subscriber. There went my first hypothesis about the problem.

Ideas and knowledge would be very appreciated :)

asked May 04 '15 by Eric Olson


1 Answer

The node-kafka driver uses quite a liberal buffer size (1M), which means it will fetch as many messages from Kafka as will fit in the buffer. If the server is backlogged, then depending on the message size this can mean (tens of) thousands of messages coming in with a single request.
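If the driver in question is kafka-node (an assumption on my part based on the name), the fetch buffer is controlled by the fetchMaxBytes consumer option, so one mitigation is simply to shrink it. A sketch, with a hypothetical topic name:

var kafka = require('kafka-node');
var client = new kafka.Client('localhost:2181');

// fetchMaxBytes defaults to 1024 * 1024 (1M); a smaller buffer means fewer
// messages arrive per fetch, and therefore fewer emits per burst
var consumer = new kafka.Consumer(
    client,
    [{ topic: 'myTopic' }],
    { fetchMaxBytes: 64 * 1024 }
);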

Because EventEmitter is synchronous (it doesn't use the Node event loop), the driver then emits (tens of) thousands of events to its listeners in a row, and it won't yield to the event loop until all of those messages have been delivered.
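The synchronous delivery is easy to demonstrate in isolation:

var EventEmitter = require('events').EventEmitter;
var emitter = new EventEmitter();

emitter.on('tick', function(n) {
    console.log('listener got ' + n);
});

console.log('before emit');
emitter.emit('tick', 1);   // the listener runs right here, synchronously
console.log('after emit');
// Prints: before emit / listener got 1 / after emit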

I don't think you can work around the flood of event deliveries, but I also don't think the event delivery itself is the problem. The more likely problem is starting an asynchronous operation (in this case a MySQL query) for each event, which can flood the database with queries.

A possible workaround would be to use a queue instead of performing the queries directly from the event handlers. For instance, with async.queue you can limit the number of concurrent (asynchronous) tasks. The "worker" part of the queue would perform the MySQL query, and in the event handlers you'd merely push the message onto the queue.
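A minimal sketch of that approach, reusing the consumer and mysql objects from the question (the concurrency limit of 10 is an arbitrary choice):

var async = require('async');

var queryCount = 0;

// Worker: performs the MySQL query; at most 10 run at any one time
var queue = async.queue(function(message, done) {
    mysql.query('SELECT "test" AS testQuery', function(err, rows, fields) {
        queryCount++;
        console.log('Query   #' + queryCount);
        done(err);
    });
}, 10);

// Event handler: cheap and synchronous; just enqueue the message
consumer.on('message', function(message) {
    queue.push(message);
});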

answered Sep 21 '22 by robertklep