Why did deserializing a serialized copy of my target array fix my memory leak?

I have two functions used for de-duplicating and searching arrays. It's trivial stuff. I noticed that in some cases, for arrays with more than about 100 elements, a huge memory leak would occur when using these functions and my Node.js app would die.

What could possibly be "wrong" with my array arr such that deserializing a serialized version of it fixes the memory leak?

Possibly related:

arr is built with a few dozen calls to concat.
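
For illustration, the construction looks roughly like this (scrapePage and pages are hypothetical stand-ins here; the real source of each chunk isn't shown):

var arr = [];
pages.forEach(function(page){
    // each chunk is an array of product objects like the example further down
    arr = arr.concat(scrapePage(page));
});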

The search function:

function findObjInArray(arr, obj, lookupKeyChain){
    // Walks each element of arr down lookupKeyChain and returns the
    // element whose nested value equals obj, or undefined if none does.
    var tmpObj = undefined;
    for(var i = 0; i < arr.length; i++){
        tmpObj = arr[i];
        for(var j = 0; j < lookupKeyChain.length; j++){
            tmpObj = tmpObj[lookupKeyChain[j]];
            if(!tmpObj){
                break;
            }else if(j==lookupKeyChain.length-1){
                if(tmpObj==obj){
                    return arr[i];
                }
            }
        }
    }
    return undefined;
}
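
For example, with the product structure shown further down, a lookup by name looks like:

// returns the matching element of arr, or undefined if no product has that name
findObjInArray(arr, "Big Widgets", ["name"]);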

The de-dupe function:

function combineProducts(productList, idKey){
    var deDupedProducts = [];
    for(var i = 0; i < productList.length; i++){
        var precedent = findObjInArray(deDupedProducts, productList[i][idKey], [idKey]);
        if(precedent){
            //just add tag data to precedent
            for(var j = 0; j < productList[i].tags.length; j++){
                precedent.tags.push(productList[i].tags[j]);
            }
        }else{
            deDupedProducts.push(productList[i]);
        }
    }
    return deDupedProducts;
}

An example of the structure in arr:

    [
        {
            "price": "$9.99",
            "name": "Big Widgets",
            "tags": [
                {
                    "tagClass": "Category",
                    "tagName": "On Sale"
                }
            ]
        },
        {
            "price": "$5.00",
            "name": "Small Widgets",
            "tags": [
                {
                    "tagClass": "Category",
                    "tagName": "On Sale"
                }
            ]
        },
        ...
    ]

The call that causes the memory leak:

combineProducts(
    arr,
    "name"
)

The call that fixed the issue and gave me the correct result:

combineProducts(
    JSON.parse(JSON.stringify(arr)),
    "name"
)
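
For context, the round-trip produces a deep copy made only of plain objects, arrays, and primitives; functions, getters, and prototypes on the original objects don't survive JSON.stringify. A minimal illustration (the getter here is hypothetical, not from my actual data):

var original = {
    name: "Big Widgets",
    get price(){ return "$" + (Math.random() * 10).toFixed(2); } // recomputed on every access
};
var copy = JSON.parse(JSON.stringify(original));
// copy.price is now a plain snapshot value; the getter itself is gone
console.log(Object.getOwnPropertyDescriptor(copy, "price").get); // undefined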

Unrelated to the leak itself, but an object-based lookup is much more efficient (and more concise) for large lists than your ever-expanding linear search: it replaces the repeated O(n) scans of deDupedProducts with constant-time key lookups.

function combineProducts(productList, idKey) {
    var lookup = {};

    productList.forEach(function(product) {
        var precedent = lookup[product[idKey]];

        if (precedent) {
            // already seen this id: merge tag data into the first occurrence
            precedent.tags = precedent.tags.concat(product.tags);
        }
        else {
            lookup[product[idKey]] = product;
        }
    });

    // flatten the lookup table back into an array
    return Object.keys(lookup).map(function(idValue) {
        return lookup[idValue];
    });
}

The only difference from your function is that ordering is not preserved (though if the data were ordered by the idKey to start with, a single-pass algorithm would be even better; see the sketch below).
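
A minimal sketch of that single-pass variant, assuming productList arrives already sorted by idKey:

function combineProductsSorted(productList, idKey) {
    var deDuped = [];

    productList.forEach(function(product) {
        var last = deDuped[deDuped.length - 1];

        if (last && last[idKey] === product[idKey]) {
            // same id as the previous kept product: just merge tags
            last.tags = last.tags.concat(product.tags);
        }
        else {
            deDuped.push(product);
        }
    });

    return deDuped;
}

Because equal ids are adjacent in sorted input, each product only ever has to be compared against the most recently kept one, so the whole thing is one O(n) pass with no lookup table at all.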