I'm using websockets to transfer video files, which means large payloads. Both the server side and the client side are implemented in Node.js, using binaryjs.
It worked fine until the number of clients grew, at which point the server crashed (the process was killed by the Linux OS). As far as I can tell it ran out of memory: every client takes a lot of it, but the problem is that when a client disconnects the memory is not freed. I would expect this to be handled internally so that I don't have to manage memory myself. Am I wrong about that? Could I be doing something wrong?
From what I can see, the `send` call allocates memory for the data it has to send but never frees it (if I comment out that line, the memory problem goes away). Here's the code:
    var fs = require('fs');
    var BinaryServer = require('binaryjs').BinaryServer;

    var bs = BinaryServer({port: 8080});
    var nchunks = 116;

    bs.on('connection', function(client){
        // Send every segment of the video to the client that just connected.
        for(var i = 1; i <= nchunks; i++){
            var name = "/var/www/" + i + ".m4s";
            var fd = fs.openSync(name, 'r');
            var buf = new Buffer(fs.fstatSync(fd).size);
            fs.readSync(fd, buf, 0, buf.length, null);
            client.send(buf);
            fs.closeSync(fd);
            if(i == nchunks){
                client.send("end"); // tell the client there are no more chunks
            }
        }
        client.on('close', function(){
            console.log("closing");
        });
    });
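For reference, this is roughly how I watched the memory grow (a minimal, standalone sketch, not my server code; the 50 MB figure is made up to stand in for a batch of .m4s chunks). Note that Buffers live outside the V8 heap, so `rss` is the number that moves, not `heapUsed`:

```javascript
// Log resident set size before and after allocating one big Buffer.
// process.memoryUsage() is a standard Node.js API.
function rssMB() {
  return (process.memoryUsage().rss / 1024 / 1024).toFixed(1);
}

console.log('before: ' + rssMB() + ' MB');
var buf = Buffer.alloc(50 * 1024 * 1024, 1); // ~50 MB, filled so pages are committed
console.log('after:  ' + rssMB() + ' MB');
```

On my server the "after" number climbs with every connection and never comes back down, even after the close event fires.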
When the client has received all of the video chunks it closes the socket, and I know the socket really gets closed because the server logs the "close" event. Shouldn't the memory be freed at that point?
The worst part is that, since I couldn't find the error, I suspected it might be due to how binaryjs implements things, so I also tried "ws" and "websocket-node", with the same memory behavior.
Has anyone run into this problem? Any ideas?