Piping multiple file streams using Node.js

I want to stream multiple files, one after another, to the browser. To illustrate, think of having multiple CSS files that should be delivered concatenated as one.

The code I am using is:

var directory = path.join(__dirname, 'css');
fs.readdir(directory, function (err, files) {
  async.eachSeries(files, function (file, callback) {
    if (!endsWith(file, '.css')) { return callback(); } // (1)

    var currentFile = path.join(directory, file);
    fs.stat(currentFile, function (err, stats) {
      if (stats.isDirectory()) { return callback(); } // (2)

      var stream = fs.createReadStream(currentFile).on('end', function () {
        callback(); // (3)
      });
      stream.pipe(res, { end: false }); // (4)
    });
  }, function () {
    res.end(); // (5)
  });
});
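(The endsWith call above is a custom helper, not a built-in — String.prototype.endsWith does not exist in Node.js 0.8. A minimal implementation could look like this; the exact body is an assumption:)

```javascript
// Assumed implementation of the endsWith helper used above.
function endsWith(str, suffix) {
  return str.indexOf(suffix, str.length - suffix.length) !== -1;
}
```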

The idea is that I

  1. filter out all files that do not have the file extension .css.
  2. filter out all directories.
  3. proceed with the next file once a file has been read completely.
  4. pipe each file to the response stream without closing it.
  5. end the response stream once all files have been piped.

The problem is that only the first .css file gets piped; all remaining files are missing. It's as if (3) jumped directly to (5) after the first (4).

The interesting thing is that if I replace line (4) with

stream.on('data', function (data) {
  console.log(data.toString('utf8'));
});

everything works as expected: I see multiple files. If I then change this code to

stream.on('data', function (data) {
  res.write(data.toString('utf8'));
});

all files except the first are missing again.

What am I doing wrong?

PS: The error happens with Node.js 0.8.7 as well as with 0.8.22.

UPDATE

Okay, it works if you change the code as follows:

var directory = path.join(__dirname, 'css');
fs.readdir(directory, function (err, files) {
  var concatenated = '';
  async.eachSeries(files, function (file, callback) {
    if (!endsWith(file, '.css')) { return callback(); }

    var currentFile = path.join(directory, file);
    fs.stat(currentFile, function (err, stats) {
      if (stats.isDirectory()) { return callback(); }

      var stream = fs.createReadStream(currentFile).on('end', function () {
        callback();
      }).on('data', function (data) { concatenated += data.toString('utf8'); });
    });
  }, function () {
    res.write(concatenated);
    res.end();
  });
});

But: Why? Why can't I call res.write multiple times, instead of first buffering all the chunks and then writing them out all at once?

The code was perfectly fine; it was the unit test that was wrong ...

Fixed that, and now it works like a charm :-)