Node.js request stream ends/stalls when piped to writable file stream

I'm trying to pipe() data from Twitter's Streaming API to a file using modern Node.js Streams. I'm using a library I wrote called TweetPipe, which leverages EventStream and Request.

Setup:

var TweetPipe = require('tweet-pipe')
  , fs = require('fs');

var tp = new TweetPipe(myOAuthCreds);
var file = fs.createWriteStream('./tweets.json');

Piping to STDOUT works and stream stays open:

tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify())
  .pipe(process.stdout);

Piping to the file writes one tweet and then the stream ends silently:

tp.stream('statuses/filter', { track: ['bieber'] })
  .pipe(tp.stringify())
  .pipe(file);

Could anyone tell me why this happens?

It's hard to say from what you have here, but it sounds like the stream is getting cleaned up before you expect. This can be triggered in a number of ways; see the cleanup logic here: https://github.com/joyent/node/blob/master/lib/stream.js#L89-112

One of the streams in the chain could emit 'end', which triggers that cleanup, and then everything just stops.

Although I doubt this is the problem, one thing that concerns me is this: https://github.com/peeinears/tweet-pipe/blob/master/index.js#L173-174 destroy() should be called after emitting 'error', not before.

I would normally debug a problem like this by adding logging statements until I can see where things stop happening as expected.

Can you post a script that can be run to reproduce? (for extra points, include a package.json that specifies the dependencies :)

Following the advice above, you should attach an 'error' handler to the stream created by tp.