More efficient way to build/transfer a large data structure?

I'm building a mobile Boggle-type web app with node.js. I'm trying to find a more efficient way to load/build a massive dictionary (180,000+ words). I currently have it working, but the load time is long: users have to wait about 15 seconds for the entire thing to build, and some users time out before it finishes loading. I was wondering if anyone has any tips to improve the speed.

The way I'm currently doing this (which is probably completely inefficient):

  • I broke down the list into 26 arrays, one for each letter, and stuck each array in its own JavaScript file.
  • When the app loads, it runs a recursive function that fetches the next JS file and loads in its array, overwriting the previous one. It then loops through the entire array and inserts each word into my Trie data structure (see the sketch after this list).
  • Combined, the files with the arrays in them are around 2 MB. Once built, the data structure itself clocks in at around 12 MB, which isn't so bad on a desktop computer but does weigh down a couple of my users' smartphones.
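
For reference, a minimal sketch of that insert step; the node shape and function names here are illustrative, not my exact code:

function trieInsert(root, word) {
  // A node is a plain object: child letters as keys, plus a
  // boolean "end" flag marking where a complete word stops.
  var node = root;
  for (var i = 0; i < word.length; i++) {
    var ch = word[i];
    if (!node[ch]) node[ch] = {}; // create the child on demand
    node = node[ch];
  }
  node.end = true; // mark a complete word
}

// Feed one loaded array of words into the trie.
function addWords(root, words) {
  for (var i = 0; i < words.length; i++) {
    trieInsert(root, words[i]);
  }
}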

This needs to be built on the client side to allow instant lookups. The way I'm doing it currently works, but I know there has to be a better way.
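
To show why the trie is worth the trouble: a lookup touches one node per letter, no matter how many words are stored. A sketch, using the same illustrative node shape as above:

function trieHas(root, word) {
  // O(word.length) regardless of dictionary size.
  var node = root;
  for (var i = 0; i < word.length; i++) {
    node = node[word[i]];
    if (!node) return false; // no path for this letter
  }
  return node.end === true; // path exists and ends a word
}

// e.g. trieHas(root, "boggle")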

Another tactic is to convert your recursive code into non-recursive code that uses an explicit stack, saving only the objects you actually need.
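
To illustrate what that conversion looks like, here is a hypothetical trie walk (not your code) written both ways; the iterative version keeps only small { node, prefix } records on its own stack instead of call frames:

// Recursive: one call frame per trie node visited.
function collectWordsRecursive(node, prefix, out) {
  if (node.end) out.push(prefix);
  for (var ch in node) {
    if (ch !== "end") collectWordsRecursive(node[ch], prefix + ch, out);
  }
}

// Iterative: an explicit stack of { node, prefix } records.
function collectWordsIterative(root) {
  var out = [];
  var stack = [{ node: root, prefix: "" }];
  while (stack.length) {
    var cur = stack.pop();
    if (cur.node.end) out.push(cur.prefix);
    for (var ch in cur.node) {
      if (ch !== "end") {
        stack.push({ node: cur.node[ch], prefix: cur.prefix + ch });
      }
    }
  }
  return out;
}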

Have you tried profiling your code?

To address the load time: are you fetching the files in a fashion like the one below? (Without more of your code, we can't possibly know.)

function LoadFiles(fileArray) {
  var file = fileArray.shift(); // take (and remove) the first file
  $.ajax(file).done(function (data) {
    /* yes, my object is a little funky; I'm focused on writing pseudocode */
    wordLibraryAdd(data);

    if (fileArray.length) { // on a zero length, quit processing
      // a 50 ms buffer between each load isn't bad
      setTimeout(function () { LoadFiles(fileArray); }, 50);
    }
  });
}
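
Kicking it off would look something like this, assuming the word lists are served as JSON arrays at URLs like dict/a.json (the paths and format here are illustrative):

// Build the 26 per-letter URLs and start the sequential load.
var files = "abcdefghijklmnopqrstuvwxyz".split("").map(function (c) {
  return "dict/" + c + ".json";
});
LoadFiles(files);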