I have a REST API that receives files. Each file that my Node application stores in local storage is also added to an XML and a JSON document. How can I prevent the app from writing two files to the document at almost the same time, which would produce an invalid document?
At the moment I put all incoming files into an array and write only the last one from the array, so that the XML/JSON documents are only accessed by one write at a time.
this.files = [];
this.running = false;

this.app.post('/file', (req, res) => {
  this.files.push(req.body.file);
  if (!this.running) {
    this.storeFile();
  }
  res.sendStatus(202);
});

this.storeFile = () => {
  this.running = true;
  // write the newest file into the xml/json document
  const file = this.files[this.files.length - 1];
  fs.writeFile(this.docPath, file, (err) => {
    if (err) {
      return this.err(err);
    }
    this.files.pop();
    if (this.files.length > 0) {
      this.storeFile();
    } else {
      this.running = false;
    }
  });
};
This doesn't prevent several files from being written at the same time. If running is false and two push events come in at almost the same time, the loop is executed twice and each file is stored twice. The xml/json file is then accessed/opened/written twice, which causes an error/exception.
This is a "how to handle concurrent access to a shared resource" type of question. The fastest way to solve it is to use a database, where the problem is already solved by other developers with ACID-compliant transactions.
Other solutions depend on the structure of your code and your expectations. You could lock the file for writing or use a semaphore to control access to it, but since you are reading and rewriting the data stored in the file, some changes may be lost, some locks may be held until they time out, etc. Another approach that avoids this problem is event sourcing, in which you only ever append to the file and never overwrite it.