Socket.IO memory leak if clients never accept messages

We've been using a load-balancing desktop tool in the office that can establish Socket.IO connections and send messages, but after about 50 concurrent connections the app runs out of memory and is killed by the server. My suspicion is that because the desktop app never acknowledges the messages, Socket.IO queues them up in memory and the process falls over within 30 seconds or so.

Is that what's happening? And how can I mitigate these "bad" WebSocket connections?
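For context, one mitigation I've been considering is to cap the per-client backlog and forcibly disconnect clients that fall too far behind, rather than letting unread messages accumulate. The sketch below is illustrative only (the `makeClient`/`send` names and the limit are made up, not Socket.IO API); in real Socket.IO the equivalent would presumably be checking how much data is buffered on the underlying connection and calling `socket.disconnect(true)` on slow consumers, or using `socket.volatile.emit` so messages to unready clients are dropped instead of queued.

```javascript
// Illustrative sketch: bound the per-client backlog so slow consumers
// are dropped instead of queuing messages in memory indefinitely.
// MAX_BACKLOG, makeClient, and send are hypothetical names, not Socket.IO API.
const MAX_BACKLOG = 100; // max messages a client may leave unconsumed

function makeClient(id, disconnect) {
  return { id, backlog: [], disconnect };
}

// Queue a message for a client; if the client has fallen too far
// behind, disconnect it and free the memory held by its queue.
function send(client, message) {
  client.backlog.push(message);
  if (client.backlog.length > MAX_BACKLOG) {
    client.disconnect();        // drop the slow consumer
    client.backlog.length = 0;  // release the queued messages
    return false;               // signal that the client was dropped
  }
  return true;
}
```

The idea is that a misbehaving client costs at most `MAX_BACKLOG` messages of memory before it is evicted, so 50 bad connections can no longer exhaust the heap.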