Performing piped operations on individual chunks (node-wav)

I'm new to node and I'm working on an audio stream server. I'm trying to process / transform the chunks of a stream as they come out of each pipe.

So, file = fs.createReadStream(path) (the file stream) is piped into wavy (which removes the headers and outputs raw PCM), which is piped into waver (which adds a proper WAV header to each chunk), which is piped into spark (which outputs the chunk to the client).

The idea is that each filestream chunk has its headers removed if present (which only applies to the first chunk); then, using the node-wav Writer, that chunk is given new headers and sent to the client. As I'm sure you guessed, this doesn't work.

The pipe operations into node-wav are acting on the entire filestream, not the individual chunks. To confirm this, I've checked the output client-side: the headers are being dropped and re-added to the entire data stream, not to each chunk.

From what I've read of the Node Stream docs it seems like what I'm trying to do should be possible, just not the way I'm doing it. I just can't pin down how to accomplish this.

Is it possible, and if so what am I missing?

Complete function:

processAudio = (path, spark) ->
  wavy = new wav.Reader()
  waver = new wav.Writer()
  file = fs.createReadStream(path)
  file.pipe(wavy).pipe(waver).pipe(spark)

I don't really know about WAVs and headers, but if you're "trying to process / transform the chunks of a stream as they come out of each pipe", you can use a Transform stream.

It lets you sit between two streams and modify the bytes flowing between them:

var util = require('util');
var Transform = require('stream').Transform;

function Test(options) {
  Transform.call(this, options);
}
util.inherits(Test, Transform);

Test.prototype._transform = function(chunk, encoding, cb) {
  // do something with chunk, then pass the (possibly modified)
  // chunk downstream
  cb(null, chunk);
};

To observe the stream and potentially modify it, pipe it like this:

file.pipe(wavy).pipe(new Test()).pipe(waver).pipe(spark)