I'm doing some load testing and writing Node scripts for it. My results were pretty poor, which worried me until I realized that my test code was to blame. I'm averaging about 30-50 requests per second (into the server) using the code below to make the requests. 30-50 seems awfully low. This is on a 4-core Mac. Is this right, or am I doing something totally wrong?
var http = require('http');

http.globalAgent.maxSockets = 100000;

var Request = function (request, params, host, port, completionFn, errorFn) {
    if (!params)
        params = '';
    if (typeof errorFn !== 'function') {
        errorFn = function (e) {
            console.log('request error!? ' + e.message);
            process.exit();
        };
    }
    var paramsStr = '';
    for (var item in params)
        paramsStr += '&' + item + '=' + encodeURI(params[item]);
    var path = '/' + request;
    if (paramsStr !== '')
        path += '?' + paramsStr.substr(1);
    var options = {
        host: host,
        port: port,
        path: path,
        agent: false
    };
    http.request(options, function (response) {
        var responseData = '';
        response.on('data', function (chunk) {
            responseData += chunk;
        }).on('end', function () {
            // To_JSON is a JSON-parsing helper defined elsewhere in my test harness
            completionFn(httpRequest.To_JSON(responseData));
        }).on('error', errorFn);
    }).on('error', errorFn).end();
};
New info:
Interestingly enough, running this in Chrome nets me about 250 requests per second, which seems more reasonable for a single node. The browser does crash pretty quickly, though.
for (var i = 0; i < 1000000; i++) {
    $.get('/service', { index: i }, function (result) {})
        .error(function () {
            out.append('fail ');
        });
}
/*jshint node:true */
var http = require('http');
var i = 100;
var called = 0;
while (i--) {
requestGoogle();
}
function requestGoogle() {
'use strict';
var requestNum = called++;
console.log('CALLED: ' + called);
var options = {
hostname: 'www.google.com',
port: 80,
path: '/',
method: 'GET'
};
var req = http.request(options, function (res) {
res.setEncoding('utf8');
res.on('data', function () {
console.log('RECEIVING DATA ON: ' + requestNum);
});
res.on('end', function () {
console.log('CALLED: ' + --called);
});
});
req.on('error', function (e) {
console.log('problem with request: ' + e.message);
});
req.end();
}
So, when I run this example I notice two things. First, at the beginning I get a good mix of data coming in for various requests:
RECEIVING DATA ON: 2
RECEIVING DATA ON: 0
RECEIVING DATA ON: 2
RECEIVING DATA ON: 0
RECEIVING DATA ON: 3
RECEIVING DATA ON: 3
RECEIVING DATA ON: 3
RECEIVING DATA ON: 3
RECEIVING DATA ON: 3
RECEIVING DATA ON: 3
RECEIVING DATA ON: 3
RECEIVING DATA ON: 1
RECEIVING DATA ON: 1
RECEIVING DATA ON: 1
RECEIVING DATA ON: 1
RECEIVING DATA ON: 1
RECEIVING DATA ON: 1
RECEIVING DATA ON: 1
RECEIVING DATA ON: 1
RECEIVING DATA ON: 4
RECEIVING DATA ON: 4
RECEIVING DATA ON: 4
We are seeing chunks from requests 1, 4, 3, 0, 2... all coming in simultaneously. Awesome, right? Well, after a while this behavior distills into a very linear set of results, with zero mixing happening at all: eventually we crowd the event loop with queued requests. When this logic runs server side, requests are not queued up by an immediate looping function call, literally instantaneously. No matter how many requests come into a server (except under DDoS conditions), there are still a precious few spare cycles for the node process to call its event handlers and come back to the main event loop.
When you spawn requests from a client in a while loop, those precious few cycles are not available. Instead, we have a crowded event loop that results in essentially linear handling of your requests: once the initial 10-13 requests have received data, they start receiving it in order, and things happen much slower than you would expect, compared to how quickly node can handle server-side requests, because the event loop is overcrowded. Your client program has essentially DoSed itself! :)
You can attempt to curb this behavior by increasing the number of sockets, but ultimately your machine will run out of file descriptors. At that point node will either crash (or call the error handlers on the requests involved), or slow down and wait for file descriptors to become available, handling future requests in a very linear, synchronous fashion. There are ways to raise your file descriptor limit, but ultimately, to benchmark properly, you need more than one machine. No operating system is designed to handle enough socket descriptors/network I/O to adequately stress test even a single node server. If that were the case, DDoS attacks would be much easier!
Also: look into JMeter; it's much better suited to the client side of stress testing than node is.