I have the following piece of code:
    var fs = require('fs');   // needed for readFileSync

    var schedules = io.on('connection', function (client) {
        // Synchronous read: blocks the event loop until the whole file is read.
        var schJSON = JSON.parse(fs.readFileSync(__dirname + '/huge_file.json', 'utf8'));
        client.json.send(schJSON);
    });
Since readFileSync is a blocking call, I figured that while one client's request was being processed by the server, the other clients would get queued up. So if reading the file takes around 10 seconds and I fire off three different connections to the server, the third connection should take around 30 seconds to get a response.
In practice, all three clients get the response almost at the same time (after 10 seconds have elapsed). The three requests were fired from three different machines (with the same external IP address).
How is this possible?
I would expect the flow to go something like this:

1. The first connection's readFileSync blocks the event loop for ~10 seconds while the file is read.
2. client.json.send starts sending the response.
3. The remaining connections are then handled; their readFileSync calls complete very quickly since the file is now in the disk cache, and client.json.send is called for each, so all three requests' responses end up being sent in parallel.