For a log viewer, I need to write updates from a MongoDB collection to a SockJS socket.
Based on this post, I'm using the cursor stream of the native Node.js driver to do so, but it only emits the documents that are in the collection when I create the stream. None of the further updates are written.
Here is my code:
var mongodb = require('mongodb');
var db = {};

var server = new mongodb.Server(config.db_host, config.db_port, {});
new mongodb.Db('myLogs', server, {w: 0}).open(function (error, database) {
    if (error) throw error;
    db.logs = database.collection('logs');
});
var stream = db.logs.find({user: sID}, {sort: [['_id', 'asc']]}).stream();

stream.on('error', function (err) {
    socket.write(JSON.stringify({'action': 'log', 'param': 'log db streaming error'}));
});

stream.on('data', function (doc) {
    socket.write(JSON.stringify({'action': 'log', 'param': doc.log}));
});
What am I doing wrong? Can this work?
If you have a capped collection, you can use a tailable cursor, which does what you want. A standard cursor stream only returns the documents that matched at the time of the find() call (as you've seen).
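If your logs collection isn't capped yet, it has to be created that way first. A rough sketch using createCollection, assuming a 1 MB cap (the size is an arbitrary choice, and database is the handle from the open() callback in your question):

database.createCollection('logs', {capped: true, size: 1024 * 1024}, function (err, collection) {
    if (err) throw err;
    db.logs = collection;
});

An existing collection can also be converted server-side with the convertToCapped command instead.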
There isn't a wealth of info on doing this in Node.js, though. Here's a pointer in the right direction.
While I haven't tested this code, it should look something like the snippet below. The key is to use a capped collection and set the tailable and awaitdata options to true.
var stream = db.logs.find({user: sID}, {
    tailable: true,
    awaitdata: true
    /* other options */
}).stream();

stream.on('data', function (doc) {
    socket.write(JSON.stringify({'action': 'log', 'param': doc.log}));
});
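One caveat: a tailable cursor can still die, for example if the collection is empty when the query first runs, or if the last document it returned gets overwritten by the cap. When that happens the stream ends and you have to query again. A rough sketch that re-opens the stream on 'end' (the openStream wrapper and the one-second delay are just illustrative):

function openStream() {
    var stream = db.logs.find({user: sID}, {
        tailable: true,
        awaitdata: true
    }).stream();

    stream.on('data', function (doc) {
        socket.write(JSON.stringify({'action': 'log', 'param': doc.log}));
    });

    // When the tailable cursor dies, the stream ends; wait a bit and start tailing again.
    stream.on('end', function () {
        setTimeout(openStream, 1000);
    });
}

openStream();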