I have an array of fs.writeFile PNG jobs, with the data-URL headers already stripped like so:

canvas.toDataURL().replace(/^data:image\/\w+;base64,/, "")

The jobs array holds [path, base64Data] pairs like this:

jobs = [['location0/file0', data0], ['location1/file1', data1], ['location2/file2', data2], ['location3/file3', data3]];
I have just started using async and was looking through its docs; there are lots of methods. queue looks interesting, and so does parallel.
Right now I handle my jobs (in a async.waterfall) like so
function(callback){//part of waterfall
(function fswritefile(){
if(jobs.length!==0){
var job=jobs.shift();
fs.writeFile(job[0],(new Buffer(job[1],'base64')),function(e){if(e){console.log(e);}else{fswritefile();}})
}
else{callback();}
})();
},//end of waterfall part
Could this be done more efficiently/faster using this module?
async.waterfall will process jobs sequentially. I think you could do everything in parallel with async.each:
async.each(jobs, function (job, done) {
  var data = Buffer.from(job[1], 'base64');
  fs.writeFile(job[0], data, done);
}, function (err) {
  // …
});
All jobs will be started in parallel. However, Node.js limits the number of concurrent fs operations: they are dispatched to the libuv thread pool, which defaults to 4 threads.
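If you would rather cap the concurrency explicitly instead of starting every write at once, the same module also has async.eachLimit. A minimal sketch, assuming jobs holds [path, base64Data] pairs as above (the limit of 4 is just an illustration):

var fs = require('fs');
var async = require('async');

// At most 4 writeFile calls will be in flight at any time.
async.eachLimit(jobs, 4, function (job, done) {
  fs.writeFile(job[0], Buffer.from(job[1], 'base64'), done);
}, function (err) {
  if (err) console.log(err);
});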
EDIT: No matter what you do, Node.js will limit the number of concurrent operations on the fs. The main reason is that you typically only have one disk, and it would be inefficient to attempt more.
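Since you mentioned async.queue: it gives you the same kind of concurrency cap, plus the ability to keep pushing jobs as they arrive. A rough sketch (the concurrency of 4 is again illustrative, and the drain-as-property form matches async v2 and earlier; in v3 you would call q.drain(fn) instead):

var q = async.queue(function (job, done) {
  fs.writeFile(job[0], Buffer.from(job[1], 'base64'), done);
}, 4); // process up to 4 jobs concurrently

// called once the last queued job has finished
q.drain = function () {
  console.log('all files have been written');
};

jobs.forEach(function (job) {
  q.push(job);
});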