Upload data to S3 via POST using AWS SDK for Node.js/Restify

I'm trying to figure out how to upload data to an Amazon S3 bucket via a RESTful API that I'm writing in Node.js/Restify. I think I've got the basic concepts working, but things go awry when I try to use the body of my POST request. When I set up my callback function to simply pass a string to S3, it works just fine and the file is created in the appropriate S3 bucket:

function postPoint(req, res, next) {

  var point = [
    { "x": "0.12" },
    { "y": "0.32" }
  ];

  var params = { Bucket: 'myBucket', Key: 'myKey', Body: JSON.stringify(point) };

  s3.client.putObject(params, function (perr, pres) {
    if (perr) {
        console.log("Error uploading data: ", perr);
    } else {
        console.log("Successfully uploaded data to myBucket/myKey");
    }
  });

  res.send(200);
  return next();
}

server.post('/point', postPoint);

Obviously, I eventually need to stream/pipe the data from the body of the request. I assumed all I needed to do was switch the Body of the params to the request stream:

function postPoint(req, res, next) {

  var params = { Bucket: 'myBucket', Key: 'myKey', Body: req };

  s3.client.putObject(params, function (perr, pres) {
    if (perr) {
        console.log("Error uploading data: ", perr);
    } else {
        console.log("Successfully uploaded data to myBucket/myKey");
    }
  });

  res.send(200);
  return next();
}

But that ends up logging "Error uploading data: [TypeError: path must be a string]", which gives me very little indication of how to fix the error. Ultimately, I want to be able to pipe the request, since the data being sent could be quite large (I'm not sure whether the previous examples cause the body to be buffered in memory), so I thought that something like this might work:

function postPoint(req, res, next) {

  var params = { Bucket: 'myBucket', Key: 'myKey', Body: req };

  req.pipe(s3.client.putObject(params));

  res.send(200);
  return next();
}

I tried that because I've done something similar in a GET handler, which works just fine: s3.client.getObject(params).createReadStream().pipe(res);. But that did not work either.

I'm at a bit of a loss at this point so any guidance would be greatly appreciated!

So, I finally discovered the answer after posting on the AWS Developer Forums. It turns out that the Content-Length header was missing from my S3 requests. Loren@AWS summed it up very well:

In order to upload any object to S3, you need to provide a Content-Length. Typically, the SDK can infer the content length from Buffer and String data (or any object with a .length property), and we have special detections for file streams to get file length. Unfortunately, there's no way the SDK can figure out the length of an arbitrary stream, so if you pass something like an HTTP stream, you will need to manually provide the content length yourself.

The suggested solution was to simply pass the content length from the headers of the http.IncomingMessage object:

var params = {
  Bucket: 'bucket', Key: 'key', Body: req,
  ContentLength: parseInt(req.headers['content-length'], 10)
};
s3.putObject(params, ...);
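One thing worth guarding against (my own addition, not from the forum thread): if the client omits the Content-Length header, parseInt returns NaN and the S3 request fails with another unhelpful error. A small hypothetical helper can validate the header up front:

```javascript
// Hypothetical helper (not from the forum thread): returns the parsed
// Content-Length, or null when the header is missing or malformed, so the
// handler can respond with 411 Length Required instead of calling S3.
function contentLengthFrom(headers) {
  var len = parseInt(headers['content-length'], 10);
  return (isNaN(len) || len < 0) ? null : len;
}
```

In the handler this would look something like `var len = contentLengthFrom(req.headers); if (len === null) { return res.send(411); }` before building params.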

If anyone is interested in reading the entire thread, you can access it here.