Optimize comparing huge amounts of data with itself in NodeJS

I have a big multidimensional object in nodejs, say 50 MB worth of JSON. It contains variations of biological data. I think I can sufficiently simplify it like so:

{
    lads : {
        // a lad
        lad4515643 : {
            brains  : {
                // a brain
                brain1256251 : {
                    var01 : 'lala',
                    var02 : 'jaja',
                    var99 : 'haha',
                },
                // another brain
                brain3567432 : {},
                brain4867321 : {},
                brain5145621 : {} // etc
            },
            var01 : 'foo',
            var02 : 'bar',
            var99 : 'baz'
        },
        // another lad
        lad4555672 : {},
        lad5625627 : {},
        lad7457255 : {} // etc
    }
}

I need to compare all combinations of lads with brains to all other lads with brains to see which ones are "better", in order to build some kind of hierarchy. Some of the parent lad's keys weigh in on the brain comparison.

I figured that, since objects are iterated by reference, we can easily assign the IDs of the better ones directly. A quick glance over the code (and comments) shows what I mean:

// Iterate over lads
for (var ladId in obj.lads) {
    if (obj.lads.hasOwnProperty(ladId)) {
        var lad = obj.lads[ladId];

        // Iterate over brains
        for (var brainId in lad.brains) {
            if (lad.brains.hasOwnProperty(brainId)) {
                var brain = lad.brains[brainId];

                // Iterate over lads again
                for (var lad2Id in obj.lads) {
                    if (obj.lads.hasOwnProperty(lad2Id)) {
                        var lad2 = obj.lads[lad2Id];

                        // Iterate over this lad's brains
                        for (var brain2Id in lad2.brains) {
                            if (lad2.brains.hasOwnProperty(brain2Id)) {
                                var brain2 = lad2.brains[brain2Id];

                                // One lad+brain combination
                                var drone1 = {
                                    lad : lad,
                                    brain : brain
                                };

                                // Another lad+brain combination
                                var drone2 = {
                                    lad : lad2,
                                    brain : brain2,
                                    ladId : lad2Id, // Required to store the reference if better
                                    brainId : brain2Id // Required to store the reference if better
                                };

                                // Do the comparison unless we are comparing ourselves
                                if (brain != brain2) {
                                    // Objects are passed as reference, so this is convenient:
                                    judge(drone1, drone2);
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
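For what it's worth, the same traversal can be written as a flatten-then-pair-loop: build the list of lad+brain combinations once, then do a plain double loop over the array. This is just a sketch of the restructuring I have in mind (the `flattenDrones`/`compareAll` names are mine, not from my real code):

```javascript
// Flatten the nested lads/brains structure into a flat array of
// lad+brain combinations ("drones"), built once up front.
function flattenDrones(obj) {
    var drones = [];
    for (var ladId in obj.lads) {
        if (obj.lads.hasOwnProperty(ladId)) {
            var lad = obj.lads[ladId];
            for (var brainId in lad.brains) {
                if (lad.brains.hasOwnProperty(brainId)) {
                    drones.push({
                        lad : lad,
                        brain : lad.brains[brainId],
                        ladId : ladId,
                        brainId : brainId
                    });
                }
            }
        }
    }
    return drones;
}

// With the flat array, comparing every combination against every
// other one is a plain double loop; skipping i === j avoids
// comparing a combination with itself.
function compareAll(drones, judge) {
    for (var i = 0; i < drones.length; i++) {
        for (var j = 0; j < drones.length; j++) {
            if (i !== j) {
                judge(drones[i], drones[j]);
            }
        }
    }
}
```

The `hasOwnProperty` checks and the drone objects are then paid for once per combination instead of once per comparison, although the number of `judge()` calls is of course unchanged.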

// Judge who is better
function judge(drone1, drone2) {
    // some magic that compares the two lad+brain combos
    // (placeholder: true when drone2 beats drone1)
    if (magic) {
        // Add list of better versions
        drone1.brain.better = drone1.brain.better || [];

        // Everything is passed by reference - I can modify the original brain object directly
        drone1.brain.better.push({
            ladId : drone2.ladId,
            brainId : drone2.brainId
        });
    }
}

Now of course, the number of iterations grows quadratically with the size of the dataset. With 3000 brains in total, there are already about 9 million comparisons, which with the magic adds up to more than 10 seconds of execution time.
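To make the scale concrete, for n lad+brain combinations there are n * (n - 1) ordered pairs once self-comparisons are excluded:

```javascript
// Number of ordered pairs among n lad+brain combinations,
// excluding the n cases of comparing a combination with itself.
function comparisonCount(n) {
    return n * (n - 1);
}

// 3000 combinations -> 8,997,000 judge() calls, i.e. just under 9 million
```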

What optimizations would be (hugely) beneficial in a scenario like this, apart from using multiple threads?

Since judge() is purely math, does it really make a difference if I convert every single step of the iteration to a callback style? (In my imagination, that would only create a huge overhead of anonymous functions and memory usage.)