I'm using a database and could load the data from the database on every request, but that might be too slow. So my idea was to hold the data live in server memory (as servers usually do). This works great with a single server. But as soon as a second server comes into play, handling the situation gets much more complicated. Please bear in mind that the servers might be physically separated.
What is the best practice to ensure that the data on two or more Node.js servers stays identical without using a database?
My first idea was to have a super object manager per server instance. This object manager would assign a globally unique id (unique across all servers, stored in the database) to each object that I create/load (for example a user loaded from the database). Let's say the user object has the id 3h34ds. Now, when I update my user object, the object manager handles the change and broadcasts a message to all other servers containing the object id (which is globally unique) and metadata (e.g. as JSON).
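To make the idea concrete, here is a minimal sketch of such an object manager. The `peers` array stands in for whatever transport actually connects the servers (TCP sockets, WebSockets, a message bus); here it is just a list of send callbacks, and all names are my own illustration, not an existing API:

```javascript
// Minimal sketch of a per-server object manager that broadcasts changes.
class ObjectManager {
  constructor(peers = []) {
    this.objects = new Map(); // globally unique id -> live object
    this.peers = peers;       // hypothetical transport: callbacks to other servers
  }

  register(id, obj) {
    this.objects.set(id, obj);
    return obj;
  }

  // Apply a local change, then broadcast { id, changes } as JSON to all peers.
  update(id, changes) {
    Object.assign(this.objects.get(id), changes);
    const message = JSON.stringify({ id, changes });
    for (const send of this.peers) send(message);
  }

  // Called when a broadcast from another server arrives.
  receive(message) {
    const { id, changes } = JSON.parse(message);
    const obj = this.objects.get(id);
    if (obj) Object.assign(obj, changes);
  }
}
```

With two managers wired to each other, an `update` on one side would patch the same object on the other side by id. Note that this sketch ignores the hard parts (delivery failures, ordering, concurrent writes to the same object), which is exactly where the reliability question below comes in.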
Is this reliable? Is this achievable? I don't think so, because I'd have to notify the object manager every time I change something. That would break the code flow, e.g.:

user.name = "Test";
objectManager.notify(user); // this is ugly
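One way to avoid the explicit `notify` call, sketched here purely as an illustration (the `managed` helper and `onChange` callback are my own names, not an existing library), is to wrap managed objects in a `Proxy` so that plain property assignments trigger the notification automatically:

```javascript
// Sketch: wrap an object in a Proxy so assignments notify automatically.
function managed(obj, id, onChange) {
  return new Proxy(obj, {
    set(target, prop, value) {
      target[prop] = value;        // apply the change locally
      onChange(id, prop, value);   // e.g. broadcast to the other servers
      return true;                 // signal that the assignment succeeded
    },
  });
}

// Usage: `user.name = "Test"` now fires onChange without any notify() call.
const user = managed({ name: "old" }, "3h34ds", (id, prop, value) => {
  // hypothetical broadcast hook; here you'd hand off to the object manager
});
```

This keeps the natural `user.name = "Test"` code flow, though it doesn't by itself solve ordering or conflict handling between servers.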
Are there better ways?