Though Node.js is a pretty hot topic, I happened to find a report that Node.js might not be appropriate for real-time applications because of its garbage collection model (http://amix.dk/blog/post/19577). Also, some benchmarks show that Node.js responds slowly compared to RingoJS (http://hns.github.com/2010/09/29/benchmark2.html).
For the time being, Node.js is bound to the V8 JavaScript engine, which uses a generational stop-the-world GC.
So, would Node.js fall over when incoming requests are massive? If anyone has real production statistics, that would be even better.
Thanks
The cost of garbage collection depends on the number of objects in the heap, particularly the number of long-lived objects. The more you have, the more time will be spent in GC.
Yes, V8 can currently take some sizable GC pauses if the heap is large. It sounds like the V8 team is working on minimizing the cost of each GC pause by spreading the work out. You can see the cost of GC in your own Node programs by starting them with --trace-gc.
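To make that concrete, here is a minimal sketch (the file name gc-demo.js is just for illustration) that allocates both short-lived garbage and a growing set of long-lived objects. Run it with --trace-gc and the young-generation (scavenge) pauses should stay small while the full-heap pauses grow as the retained set does:

    // gc-demo.js — run with: node --trace-gc gc-demo.js
    const retained = []; // long-lived objects that survive into the old generation
    let ticks = 0;

    const timer = setInterval(function () {
      // Short-lived garbage: cheap to reclaim in the young generation.
      for (let i = 0; i < 1e5; i++) {
        const tmp = { n: i }; // becomes unreachable immediately
      }
      // Long-lived objects: survive scavenges, get promoted to the old
      // generation, and make full (mark-sweep) pauses grow over time.
      retained.push(new Array(1000).fill(0));
      if (++ticks === 2000) clearInterval(timer); // stop after ~10s
    }, 5);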
For many applications, the cost of GC is offset by V8's increasingly excellent optimizing compiler. I'd suggest trying a simple program and measuring both the cost of GC as reported by V8 and the client-to-client latency (a rough sketch of the latency measurement follows). I've found the GC costs to be almost completely negligible when the clients are connecting over the open Internet.
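As a rough sketch of that kind of measurement (against a local server for simplicity; in a real test the client would run on a separate machine over the Internet, and the percentile math here is just illustrative):

    // latency-demo.js — measure round-trip latency against a local server
    const http = require('http');

    const server = http.createServer(function (req, res) {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('ok');
    });

    server.listen(8000, function () {
      let remaining = 100;
      const samples = []; // round-trip times in milliseconds

      function probe() {
        const start = process.hrtime.bigint();
        http.get('http://127.0.0.1:8000/', function (res) {
          res.resume(); // drain the response body
          res.on('end', function () {
            samples.push(Number(process.hrtime.bigint() - start) / 1e6);
            if (--remaining > 0) {
              probe();
            } else {
              samples.sort(function (a, b) { return a - b; });
              console.log(`p50 ${samples[49].toFixed(2)} ms, ` +
                          `p99 ${samples[98].toFixed(2)} ms`);
              server.close();
            }
          });
        });
      }
      probe();
    });

Starting the server side of such a test with --trace-gc lets you line up any latency spikes in the percentiles against the GC pauses V8 reports.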