Handle ajax response with node.js

I am trying to scrape information from a specified website. The site requires authentication first, so I use zombie.js:

var Browser = require("zombie");
var browser = new Browser();

browser.visit("https://*****login.aspx", function () {
    browser.fill('#user', '*****');
    browser.fill('#pwd', '*****');
    var button = browser.querySelector('#btnSubmit');
    browser.fire('click', button, function () {
        // scraping main.aspx
    });
});

It's working: I can scrape main.aspx. There is a <table> containing information about new messages (from, date, subject). The problem comes here: the subject field is clickable, and clicking on it opens a new window with the actual message. However, it is an ajax grid, and when I perform a click:

var field = browser.querySelector('#VeryLongIdOfTheField');
browser.fire('click', field, function(){    
    console.log(browser.querySelector('#VeryLongIdOfTheFieldContainingTheMessage').innerHTML);
});

it fails with an error saying that undefined has no innerHTML. I suppose it's because this action is handled with some ajax magic. I am new to this js/nodejs/jquery world, so I need some help to enlighten me.

Since the data is populated by an async ajax call, I'm guessing there's a lag between your click and the actual DOM population inside the node. How about waiting a bit before checking the node's content?

browser.fire('click', field, function () {
    setTimeout(function () {
        // guard against the node still being absent when the timer fires
        var node = browser.querySelector('#VeryLongIdOfTheFieldContainingTheMessage');
        console.log(node && node.innerHTML);
    }, 3000);
});

If the time taken is not very predictable, you could also run it inside a loop until you find the content or exit after a reasonable number of retries.
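Such a retry loop could look something like the sketch below. The `pollFor` helper is a hypothetical name (not part of zombie's API): it re-runs a check function every `interval` milliseconds until the check returns something truthy or the retry budget runs out.

```javascript
// Hypothetical helper: calls check() every `interval` ms until it returns
// a truthy value, passing that value to callback(null, value), or gives up
// with an error after `maxRetries` attempts.
function pollFor(check, interval, maxRetries, callback) {
    var attempts = 0;
    (function tick() {
        var result = check();
        if (result) {
            return callback(null, result);
        }
        attempts += 1;
        if (attempts >= maxRetries) {
            return callback(new Error("gave up after " + maxRetries + " attempts"));
        }
        setTimeout(tick, interval);
    })();
}

// Usage against the selectors from the question (assumed, not verified):
// pollFor(function () {
//     return browser.querySelector('#VeryLongIdOfTheFieldContainingTheMessage');
// }, 500, 20, function (err, node) {
//     if (err) return console.error(err.message);
//     console.log(node.innerHTML);
// });
```

As an alternative, zombie also provides a `browser.wait()` method that waits for pending events to settle, which may be more reliable here than a fixed timeout.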