NodeJS: How To Confirm the MongoDB Driver Is Not Blocking

I've asked some general questions around this topic before (node and blocking). This time the question is a little more specific.

Let's say I've got a node/express app which has a handler that accepts HTTP requests (doesn't matter what; say they're simple GETs).

And it has a separate handler which reads messages off of a RabbitMQ queue as they arrive, and then does a read from Mongo (Mongo is on a different machine), followed by a write.

If Mongo was "very" busy, would/could that cause the HTTP handler to appear unavailable?

I'm using the Mongo native driver. I would think that any blocking that occurs while the Mongo driver waits for a response from the server would still leave Node happily accepting and handling HTTP requests, but I don't know for sure.

In a related scenario, swap out the busy Mongo for a handler that reads a Rabbit message and PUTs a record into a "very" busy Elasticsearch. Will that cause issues with the HTTP handler?

I'd go straight to testing it, but that's a little tricky and gets expensive if I have to test every time I'm not sure what the theory is. So I thought I'd ask.

Here's a (simplified) example of the code:

// HTTP handler...
app.post('/eventcapture/event', (req: express.Request, res: express.Response) => {
    var evt: eventDS.IEvent = ('TypeID' in req.body) ? req.body : JSON.parse(req.body);

    //create an id
    evt._id = uuid.v4();

    bus.Publish(evt)
        .then((success) => {
            res.status(200).jsonp({ success: true });
        })
        .catch((failReason:Error) => {
            console.error('[ERROR] - Failure writing event: %s,%s', failReason.name, failReason.message);
            logError(failReason, evt);

            res.status(500).jsonp({ success: false, reason: failReason });
        });
});

// We generically define additional handlers in an array, and then kick them off with a loop.
// Here we have one handler which reads an event, goes to mongo to get additional data which
// it adds into the event before publishing it back out.  And a second handler which will catch
// these "augmented" events and push them into Mongo
var processes = [
    {
        enabled: true,
        name: 'augmenter',
        inType: 'EventCapture:RawEvent',
        handler: (event: eventDS.IEvent) => {
            console.log('[LOG] - augment event: %s', event._id);

            Profile.FindOne({ _id: event.User.ProfileID })
                .then((profile) => {
                    if (profile) {
                        console.log('[LOG] - found Profile: %s', profile._id);
                        event.User.Email = profile.PersonalDetail.Email;
                        //other values also...

                        //change the TypeID for publishing
                        event.TypeID = 'EventCapture:AugmentedEvent';

                        return event;
                    }
                    else throw new Error(util.format('unable to find profile: %s', event.User.ProfileID));
                })
                .then((augmentedEvent) => bus.Publish(augmentedEvent)) //publish the event back out
                .catch((failReason:Error) => {
                    console.error('[ERROR] - failure publishing augmented event: %s, %s, %s', event._id, failReason.name, failReason.message);
                    logError(failReason, event);
                });
        }
    },
    {
        enabled: true,
        name: 'mongo',
        inType: 'EventCapture:AugmentedEvent',
        handler: (event: eventDS.IEvent) => {
            console.log('[LOG] - push to mongo: %s', event.User.ProfileID);

            Event.Save(event, { safe: true })
                .then((success) => console.log('[LOG] - pushed to mongo: %s', event._id))
                .catch((failReason:Error) => {
                    console.error('[ERROR] - failure pushing to mongo: %s, %s', event._id, failReason);
                    logError(failReason, event);
                });
        }
    }
];

processes.forEach((process, idx, allProcesses) => {
    if (process.enabled) {
        bus.Subscribe(process.name, process.inType, process.handler);
    }
});

No. This is the awesomeness of async programming: Node can do other things while it waits for MongoDB to get back to it. You can assume that popular node modules like the mongodb driver are written in an async fashion.
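You can see this for yourself with a tiny sketch (hypothetical names; a `setTimeout` stands in for a Mongo round trip). While the simulated query is "in flight", other callbacks keep firing, which is why the HTTP handler stays responsive:

```typescript
// Hypothetical sketch: a setTimeout stands in for a pending MongoDB
// round trip. While it is outstanding, the event loop keeps running
// other callbacks, so an HTTP handler would stay responsive.
const fakeMongoQuery = (): Promise<string> =>
    new Promise((resolve) => setTimeout(() => resolve('query result'), 100));

const handled: string[] = [];

fakeMongoQuery().then((result) => handled.push(result));

// These fire while the "query" is still outstanding.
setTimeout(() => handled.push('http request 1'), 20);
setTimeout(() => handled.push('http request 2'), 40);

setTimeout(() => {
    // The "HTTP requests" were handled before the slow query resolved.
    console.log(handled); // ['http request 1', 'http request 2', 'query result']
}, 200);
```

The `then` callback only runs once the response arrives; nothing sits in a loop waiting for it.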

Here's a video that goes into a lot of detail about the event loop: http://vimeo.com/96425312

At the end of the day, things like the mongo driver are written on top of Node's low-level I/O and networking libraries, and those libraries enforce an async flow. The author of a package would have to go out of her way to make it synchronous.
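If you want to confirm this empirically without standing up a "very" busy Mongo, one cheap trick (a generic sketch, nothing Mongo-specific) is to watch event-loop delay: a `setInterval` that badly misses its schedule means something synchronous is hogging the loop; if it stays on schedule while your handlers wait on I/O, nothing is blocking.

```typescript
// Sketch: measure the worst observed event-loop delay over a window.
// If a driver were blocking the loop while waiting on the server,
// this number would spike; for well-behaved async drivers it stays small.
function measureLoopDelay(intervalMs: number, durationMs: number): Promise<number> {
    return new Promise((resolve) => {
        let maxDelay = 0;
        let last = Date.now();
        const timer = setInterval(() => {
            const now = Date.now();
            maxDelay = Math.max(maxDelay, now - last - intervalMs);
            last = now;
        }, intervalMs);
        setTimeout(() => {
            clearInterval(timer);
            resolve(maxDelay);
        }, durationMs);
    });
}

// Contrast: a synchronous busy-wait (what a badly written driver would
// amount to) makes the measured delay spike immediately.
function busyWait(ms: number): void {
    const end = Date.now() + ms;
    while (Date.now() < end) { /* spin */ }
}
```

Run `measureLoopDelay` while your RabbitMQ/Mongo handlers are under load; a production setup would reach for a proper metrics library, but this is enough to tell "waiting on I/O" apart from "blocking the event loop".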