Issues with streams and S3 upload

I am having issues with a zero-byte upload. I am resizing an image with ImageMagick and uploading the result as a stream to S3, but the stored file ends up empty. If I pipe the convert output to the HTTP response instead, the resized image displays correctly.

// Required modules
var http = require('http');
var spawn = require('child_process').spawn;
var aws2js = require('aws2js');

// Fetch remote file
var request = http.get('mybigfile.jpg', function(response) {

  // Set up ImageMagick convert (read original from stdin, write thumbnail to stdout)
  var convert = spawn('convert', ['-', '-strip', '-thumbnail', '600', '-']);

  // Pipe file stream to it
  response.pipe(convert.stdin);

  // Pipe result to browser - works fine
  //convert.stdout.pipe(res);


  // S3 requires headers
  var headers = {
    'content-type': response.headers['content-type'],
    'x-amz-acl': 'public-read'
  };

  // Upload to S3
  // Upload to S3 (aws.key, aws.secret and aws.bucket come from the app's config)
  var s3 = aws2js.load('s3', aws.key, aws.secret);
  s3.setBucket(aws.bucket);
  s3.putStream('thumb.jpg', convert.stdout, false, headers, function(err) {
    if (err) {
      return console.error('Error storing:', err.toString());
    }
    // No errors - this shows - but the stored file is 0 kB
    console.log('thumb.jpg uploaded to S3');
  });
});

I have seen notes that streams do not work with S3 because the PUT request needs a Content-Length, which is unknown for a live stream. I am trying buffers instead, but no success with that so far.
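To spell out the content-length point (a minimal illustration, with a made-up size): the size of the original download is advertised in its response headers, but the size of the resized output is only known once convert has finished writing, which is too late for a header that has to be sent with the PUT.

// Length of the source file is known up front from the HTTP response
console.log(response.headers['content-length']); // e.g. '482731'

// The thumbnail's length is only knowable after convert has finished,
// so a plain pipe gives the S3 PUT nothing to declare as content-length
convert.stdout.on('end', function() {
  // the size could only be measured here, after the data has already gone by
});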

Well, no go on streams. I could probably use pause-stream or a multipart upload to technically achieve this, but otherwise I don't think it's possible. I ended up using a buffer:

...
// Pipe file stream to it
response.pipe(convert.stdin);

// Save to buffer
var bufs = [];
convert.stdout.on('data', function(chunk) {
  bufs.push(chunk);
});
convert.stdout.on('end', function() {
  var buffer = Buffer.concat(bufs);

// S3 requires headers
...

s3.putBuffer(path, buffer, false, headers, function(err) {
...
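For reference, here is the buffer approach stitched together as a self-contained sketch. The URL, config object, and key name are placeholders I've made up, and the setBucket/putBuffer calls simply mirror the aws2js signatures used above, so treat it as an outline rather than a drop-in:

// Self-contained sketch of the buffer approach (placeholder URL, config and key name)
var http = require('http');
var spawn = require('child_process').spawn;
var aws2js = require('aws2js');

// Placeholder credentials - substitute your own config
var config = { key: 'AWS_KEY', secret: 'AWS_SECRET', bucket: 'my-bucket' };

http.get('http://example.com/mybigfile.jpg', function(response) {

  // Resize on the fly: read the original from stdin, write the thumbnail to stdout
  var convert = spawn('convert', ['-', '-strip', '-thumbnail', '600', '-']);
  response.pipe(convert.stdin);

  // Collect the resized output; only once it has all arrived is its length known
  var bufs = [];
  convert.stdout.on('data', function(chunk) {
    bufs.push(chunk);
  });
  convert.stdout.on('end', function() {
    var buffer = Buffer.concat(bufs);

    // S3 requires headers
    var headers = {
      'content-type': response.headers['content-type'],
      'x-amz-acl': 'public-read'
    };

    var s3 = aws2js.load('s3', config.key, config.secret);
    s3.setBucket(config.bucket);

    // putBuffer can send a correct content-length because buffer.length is known
    s3.putBuffer('thumb.jpg', buffer, false, headers, function(err) {
      if (err) {
        return console.error('Error storing:', err.toString());
      }
      console.log('thumb.jpg uploaded to S3');
    });
  });
});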