Difference between Node.js child processes and manually separated processes?

I'm trying to measure the response time of my socket.io server. The server just echoes a message back; a test file sends messages and logs the time between sending a message and receiving its echo. To simulate multiple clients, a manager file starts this test file multiple times via child_process.

The weird thing is that I get a response time of about 0.2 with just a single test file. When I start the manager to generate 4 more clients, it rises to about 1.3, and when I start two managers in separate terminals with 4 clients each, it climbs to around 3. But when I start only one manager with 10 clients (or even something crazy like 1000), the result stays under 1.

To illustrate:

manager.js 4    -> spawns 4 children
manager.js 4    -> spawns 4 children
test.js         -> shows around 3 avg. response time

manager.js 10   -> spawns 10 children
test.js         -> shows around 0.5 avg. response time

So why do multiple separate calls spawning 4 children each produce more load than a single call spawning many children?

Here is my manager file spawning the children:

var childProcess = require('child_process');

var count = process.argv[2] || 1;
console.log("generating " + count + " children");

for (var i = 0; i < count; i++) {
    childProcess.exec('node test.js', function (error, stdout, stderr) {
        if (error) {
            console.log(error.stack);
            console.log('Error code: ' + error.code);
            console.log('Signal received: ' + error.signal);
        }
        console.log('Child Process STDOUT: ' + stdout);
        console.log('Child Process STDERR: ' + stderr);
    });
}

And this is my test file, measuring the average response time over the last second:

var io = require('socket.io-client');
var rtts = [];

var socket = io.connect('http://localhost:3000');
socket.emit('roundtrip', { time: new Date().getTime() });

socket.on('roundtrip', function (data) {
    var roundtripTime = new Date().getTime() - data.time;
    rtts.push(roundtripTime);

    socket.emit('roundtrip', { time: new Date().getTime() });
});

setInterval(function () {
    var avgRTT = 0;
    for (var i = 0; i < rtts.length; i++) {
        avgRTT += rtts[i];
    }
    avgRTT = avgRTT / rtts.length;
    console.log("AVG RTT: " + avgRTT);
    rtts = [];
}, 1000);
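One small hazard in the interval above: when no echo arrived during a given second, rtts is empty and the division produces NaN. A guarded average (the helper name avgOf is mine, not part of the original code):

```javascript
// Average of an array of numbers; returns 0 for an empty array
// instead of the NaN that dividing by a zero length would produce.
function avgOf(values) {
    if (values.length === 0) return 0;
    var sum = 0;
    for (var i = 0; i < values.length; i++) {
        sum += values[i];
    }
    return sum / values.length;
}

console.log(avgOf([2, 4, 6])); // 4
console.log(avgOf([]));        // 0
```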

I'm running this on Linux Mint 64-bit.