Memory leaks in nodejs websocket server

I'm using websockets to transfer video files, which means the files are large. Both the server side and the client side are implemented in Node.js, using binaryjs.

It worked fine until I started having a large number of clients, which made the server crash (the process was killed by the Linux OS). As I observed, it ran out of memory: every client consumes a lot of it, and when a client disconnects that memory is not freed. I thought this should be handled internally and that I should not have to worry about memory. Am I wrong? Could I be doing something wrong?

As far as I can tell, the "send" function allocates memory to hold what it has to send but never frees it (if you comment out that line, there's no memory problem). Here's the code:

var fs = require('fs');
var BinaryServer = require('binaryjs').BinaryServer;
var bs = BinaryServer({port: 8080});

var nchunks = 116;

bs.on('connection', function(client){
    // read each chunk file into a Buffer synchronously and send it
    for(var i = 1; i <= nchunks; i++)
    {
        var name = "/var/www/1.m4s";
        var fd = fs.openSync(name.replace("1.m4s", i + ".m4s"), 'r');
        var buf = new Buffer(fs.fstatSync(fd).size, 'binary');
        fs.readSync(fd, buf, 0, buf.length, null);
        client.send(buf);
        fs.closeSync(fd);

        if(i == nchunks){
            client.send("end");
        }
    }

    client.on('close', function(c){
        console.log("closing");
    });
});

When the client receives all of the video files, it closes the socket. I know the socket is getting closed because I'm capturing the "close" event on the server. Shouldn't the memory be freed at that moment?

The worst thing is that, since I couldn't find the error, I thought it might be due to how binaryjs implements sending, so I also tried "ws" and "websocket-node", with the same memory results.

Has anyone experienced these problems? Any ideas?

Shouldn't the memory be freed at that moment?

No. JavaScript is a garbage-collected language, and the garbage collector runs periodically, whenever the runtime deems it suitable. You have neither control over nor awareness of when, or whether, it will run and free memory.
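If you want to check whether that memory is actually leaked or simply not collected yet, one way (a diagnostic sketch, not something you'd keep in production; the script name is made up) is to start node with the --expose-gc flag, force a collection, and compare process.memoryUsage() before and after:

// hypothetical diagnostic script; run with: node --expose-gc gc-check.js
function logMemory(label) {
  var m = process.memoryUsage();
  // Buffers live outside the V8 heap, so rss is the more telling number here
  console.log(label, 'rss:', Math.round(m.rss / 1048576) + ' MB,',
              'heapUsed:', Math.round(m.heapUsed / 1048576) + ' MB');
}

logMemory('before allocation');
var bufs = [];
for (var i = 0; i < 100; i++) {
  bufs.push(new Buffer(1024 * 1024));  // hold on to roughly 100 MB of Buffers
}
logMemory('after allocation ');

bufs = null;   // drop the only references to the Buffers
global.gc();   // only defined because of --expose-gc
logMemory('after forced gc  ');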

Also, you cannot use any synchronous IO calls in a network server as all of your client processing will block while you do each of those IO calls.
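To illustrate why that matters, here's a minimal sketch (the file path is just a placeholder) contrasting the two styles: the synchronous read stalls the entire event loop, so no other client is serviced until it returns, while the asynchronous read lets node keep handling other connections and runs the callback when the data is ready.

var fs = require('fs');

// Blocking: the whole process does nothing else until the read returns,
// so every other connected client waits too
var data = fs.readFileSync('/tmp/somefile');
console.log('sync read:  ' + data.length + ' bytes');

// Non-blocking: the read happens in the background and the callback fires
// when it's done, so other clients keep being serviced in the meantime
fs.readFile('/tmp/somefile', function (err, data) {
  if (err) throw err;
  console.log('async read: ' + data.length + ' bytes');
});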

I think your primary problem is that you are not streaming the files down in reasonably small chunks. You are reading entire files into memory and sending them down in one piece.

var buf = new Buffer(fs.fstatSync(fd).size, 'binary');

Don't do that. Use a ReadableStream and send files down in a series of small chunks, and use asynchronous calls. This is how to get node to work correctly for you. Lack of streaming and lack of asynchronous calls are sure paths to failure in node. Here's a working example program.

var fs = require("fs");
var http = require("http");
var server = http.createServer();
server.listen(9200)
server.on('request', function (req, res) {
  fs.createReadStream('/tmp/test1').pipe(res);
});

I tested this on OS X with node v0.10.7. I can repeatedly request the file with curl localhost:9200 >/dev/null, and watching lsof -p <pid of node> I can see the /tmp/test1 file get opened and closed properly.
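Applied back to your binaryjs server, a minimal sketch of the same idea (assuming, per the binaryjs examples, that client.send accepts a readable stream, and reusing your chunk naming scheme) would stream each .m4s file rather than buffering it, and only start the next chunk once the previous one has been fully read:

var fs = require('fs');
var BinaryServer = require('binaryjs').BinaryServer;
var bs = BinaryServer({port: 8080});

var nchunks = 116;

bs.on('connection', function (client) {
  var i = 1;

  function sendNext() {
    if (i > nchunks) {
      client.send("end");
      return;
    }
    var rs = fs.createReadStream('/var/www/' + i + '.m4s');
    i++;
    // the chunk is forwarded as a stream, so the whole file
    // never has to sit in memory at once
    client.send(rs);
    rs.on('end', sendNext);   // start the next chunk only after this one is read
  }

  sendNext();

  client.on('close', function () {
    console.log('closing');
  });
});

Sending the chunks one at a time this way keeps at most one file's worth of data in flight per client, instead of queuing 116 full Buffers on the socket.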