How can I apply async to a for loop over a range?

As I understand it, async only works on arrays.

My application reads a 1.2GB file, and I want to read it in parts of 1024KB. Because of RAM constraints, I only want to read 10 parts at a time. From the documentation, eachLimit(arr, 10, iterator, callback) looks like the right function for me.

The problem is that I can't put all the parts into an array first: if I did, the RAM issue would come right back and eachLimit would be pointless.

In other words, I want to convert the following loop:

    for (var rangeStart = 0; rangeStart < stats.size; rangeStart += partSize) {
        // stats.size = 1200000000; partSize = 1024000
        // read the part of the file in this range into a buffer
    }

into a version that processes only 10 parts at a time, and continues only once those 10 have completed.
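The closest thing I have come up with (a rough sketch, not sure it is the right approach) is to build an array of just the start offsets, which stays tiny even for a 1.2GB file because it only holds numbers, and let eachLimit keep at most 10 buffers in flight. The file name and the per-part work are placeholders:

    var fs = require('fs');
    var async = require('async');

    var file = __dirname + '/large-file.bin'; // placeholder
    var partSize = 1024000;

    fs.stat(file, function (err, stats) {
      if (err) throw err;

      // Only the start offsets go into the array -- plain numbers,
      // so ~1200 entries for a 1.2GB file, not 1.2GB of buffers.
      var offsets = [];
      for (var rangeStart = 0; rangeStart < stats.size; rangeStart += partSize) {
        offsets.push(rangeStart);
      }

      fs.open(file, 'r', function (err, fd) {
        if (err) throw err;

        // At most 10 parts are read (and held in memory) at a time.
        async.eachLimit(offsets, 10, function (offset, done) {
          var length = Math.min(partSize, stats.size - offset);
          var buffer = new Buffer(length);
          fs.read(fd, buffer, 0, length, offset, function (err) {
            if (err) return done(err);
            // ...process this part, e.g. upload it...
            done();
          });
        }, function (err) {
          fs.close(fd, function () {});
          if (err) throw err;
          console.log('All parts processed.');
        });
      });
    });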

You don't need async to transform & upload large files to S3.

Simply stream the (large) file, do whatever transformation you need, and pipe the result directly to Amazon S3 using Knox.
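For example, something like this minimal sketch (assuming Knox's createClient/putStream, with placeholder credentials, bucket, and file name) streams the file straight from disk to S3 without ever holding it all in memory:

    var fs = require('fs');
    var knox = require('knox');

    // Placeholder credentials and bucket -- substitute your own.
    var client = knox.createClient({
      key: 'YOUR_AWS_KEY',
      secret: 'YOUR_AWS_SECRET',
      bucket: 'your-bucket'
    });

    var file = __dirname + '/your-large-file.txt';

    fs.stat(file, function (err, stats) {
      if (err) throw err;

      var stream = fs.createReadStream(file);
      var headers = {
        'Content-Length': stats.size,
        'Content-Type': 'text/plain'
      };

      // The file is piped to S3 chunk by chunk, so memory use stays flat.
      client.putStream(stream, '/your-large-file.txt', headers, function (err, res) {
        if (err) throw err;
        console.log('Uploaded with status', res.statusCode);
      });
    });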

If you need a detailed example of how to do this, see: https://www.npmjs.org/package/stream-to-s3 (I wrote a quick node module to illustrate it for you =)

Installation:

    npm install stream-to-s3

Usage:

    var S = require('stream-to-s3');
    var file = __dirname + '/your-large-file.txt';

    S.streamFileToS3(file, function () {
      console.log('Awesomeness', file, 'was uploaded!');
      console.log('Visit:', S.S3FileUrl(file));
    });

Done.

More detail on GitHub: https://github.com/nelsonic/stream-to-s3