I'm developing a high-load testing tool that uses the eventsource npm library to generate concurrent SSE connections.
I run it on CentOS on a DigitalOcean droplet with 512 MB of RAM and 1 CPU.
Take a look at my code:
var http = require("http"),
    EventSource = require('eventsource'),
    openSockets = 0;

http.globalAgent.maxSockets = 70000;

console.log(process.memoryUsage());

beginTest();

function beginTest(options) {
    for (var i = 0; i < 50000; i++) {
        setTimeout(formPoster(i), 0);
    }
}

function formPoster(i) {
    var url = "http://remoteserver.com/examples/events/connect.sse",
        es = new EventSource(url);

    es.onmessage = function (e) {
    };

    es.onerror = function (e) {
        console.log(e);
    };

    es.onopen = function (e) {
        openSockets++;
        if (openSockets == 1) {
            console.log(process.memoryUsage());
            process.exit(0);
        }
    };
}
The problem with this code is a huge memory leak. The first console.log statement outputs

{ rss: 8925184, heapTotal: 5066496, heapUsed: 2113192 }

whereas the second one (after the first connection opens) outputs

{ rss: 373219328, heapTotal: 296538752, heapUsed: 286450700 }

and at that point the remote server has around 1500 open connections from the local machine.

What could cause a memory leak like this?
You don't have a memory leak - you are simply instantiating a lot of objects at the same time and a garbage collection event has probably not yet occurred.
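One way to confirm this is to start Node with the --expose-gc flag (which defines global.gc) and force a full collection before taking the second measurement; if the numbers drop sharply, the memory was reclaimable garbage rather than a leak. A minimal sketch of what your onopen handler could look like under that assumption:

    es.onopen = function (e) {
        openSockets++;
        if (openSockets == 1) {
            // global.gc is only defined when Node is started with --expose-gc
            if (global.gc) global.gc();
            console.log(process.memoryUsage());
            process.exit(0);
        }
    };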
The line setTimeout(formPoster(i), 0) isn't doing what you think. Because you're calling formPoster rather than passing a reference to it, the timeout has no effect: you are simply calling formPoster 50,000 times immediately, and setTimeout only receives its return value (undefined).
In fact, even if you did pass a function to setTimeout for deferred execution like this:
    setTimeout(function () {
        formPoster(i);
    }, 0);
Then you would still see the same effect, just deferred to a later turn of the event loop: all 50,000 callbacks are scheduled for roughly the same time, so they still run back to back.
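If the goal is to ramp up to many concurrent connections without this allocation burst, one option is to stagger the connection attempts so the garbage collector can keep up between them. A minimal sketch, where the 10 ms spacing is an arbitrary value chosen for illustration:

    function beginTest() {
        for (var i = 0; i < 50000; i++) {
            // Pass a reference to formPoster; setTimeout forwards the extra
            // argument i to it and spaces the calls 10 ms apart.
            setTimeout(formPoster, i * 10, i);
        }
    }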