Node.js Unable to Write JPEG Blob sent from Canvas toDataURL

I am sending a base64 encoded JPG string to a Node.js server and attempting to write the image to a file. The written file isn't showing any pixel data - just a blank image container. I'm thinking there's an error in my Node.js JavaScript as the blob generated from the browser looks to be the right size (8.1KB for a 640x480px image). I'm doing all of this locally, so I can rule out internet issues.

The base64 string is being generated via HTML5's Canvas.toDataURL upon clicking a button:

$("#sendImage").click(function() {

    // Send synchronous request to server
    var xhr = new XMLHttpRequest();

    var blob = dataURItoBlob(renderer.domElement.toDataURL('image/jpeg'));
    xhr.open('POST', 'http://localhost:3000/' + captureFrame, false);
    xhr.send(blob);
});

As you can see, I'm sending a JPEG by specifying 'image/jpeg' in the toDataURL call. I've also tried omitting this argument to send a PNG, but the file still comes out as a blank image.

Here's the dataURItoBlob function, which strips the 'data:image/jpeg;base64,' prefix from the generated string and decodes the rest into a Blob:

function dataURItoBlob(dataURI) {
    var mimetype = dataURI.split(",")[0].split(':')[1].split(';')[0];
    var byteString = atob(dataURI.split(',')[1]);
    var u8a = new Uint8Array(byteString.length);
    for (var i = 0; i < byteString.length; i++) {
        u8a[i] = byteString.charCodeAt(i);
    }
    return new Blob([u8a.buffer], { type: mimetype });
}
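To sanity-check this conversion in isolation, the same prefix-strip and base64 decode can be reproduced server-side with Node's Buffer. This is just an illustrative sketch (dataURIToBuffer is a made-up name, not part of the setup above):

```javascript
// Sketch: the same prefix-strip + base64 decode as dataURItoBlob,
// but done in Node with Buffer instead of atob/Uint8Array.
function dataURIToBuffer(dataURI) {
    var mimetype = dataURI.split(',')[0].split(':')[1].split(';')[0];
    var bytes = Buffer.from(dataURI.split(',')[1], 'base64');
    return { mimetype: mimetype, bytes: bytes };
}
```

For example, `dataURIToBuffer('data:image/jpeg;base64,aGVsbG8=')` yields a mimetype of 'image/jpeg' and a Buffer holding the decoded bytes.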

As an alternative to this function, I've also tried stripping the prefix with the String split() method (taking the second element, since split() returns an array):

blob = blob.split('data:image/jpeg;base64,')[1];

But the same results occur.

Here is my Node.js server - It creates a buffer and fills it with the string chunks coming in from the request:

var port = 3000;
var http = require('http');
var fs = require('fs');

http.createServer(function(req, res) {
    res.writeHead(200, {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Headers': 'Content-Type, X-Requested-With'
    });
    if (req.method === 'OPTIONS') {
        // Handle OPTIONS requests to work with jQuery and other libs that
        // cause preflighted CORS requests.
        res.end();
        return;
    }

    // Get capture frame from URL
    var idx = req.url.split('/').pop();

    // Create filename string
    var filename = '/home/user/' + ('0000' + idx).slice(-5) + '.jpg';

    // Create a new empty buffer to accumulate the incoming chunks.
    var img = Buffer.alloc(0);

    req.on('data', function(chunk) {
        img = Buffer.concat([img, chunk]);
    });

    req.on('end', function() {
        // Write JPEG file to filesystem (writes blank image)
        fs.writeFileSync(filename, img);
        console.log('Wrote ' + filename);
        res.end();
    });

}).listen(port, '127.0.0.1');
console.log('Server running at http://127.0.0.1:' + port + '/');

The code above accumulates the incoming chunks into a single Buffer (no encoding is set on the request, so each chunk arrives as a raw Buffer). I'm aware that streaming the data to disk would be more efficient, but as a new Node developer I haven't managed that yet.

The file is also being written synchronously with the writeFileSync method. I've tried the asynchronous version, writeFile, as well, but with the same results!

The funny thing is that this code worked a few weeks ago. Now that I've come back to the project, a bug has appeared! Could Node be executing the JavaScript out of order (unlikely)? Or is this approach unstable, and streaming likely to fix the issue?

All help is appreciated!

The solution seems to be passing the preserveDrawingBuffer flag when instantiating the Three.js WebGL renderer:

renderer = new THREE.WebGLRenderer({ antialias: true, preserveDrawingBuffer: true });

The solution is unrelated to Node, but I'll leave it up for anyone with a similar setup who runs into the same problem.
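One last debugging tip for anyone chasing the same blank-image symptom: a JPEG file always begins with the start-of-image marker bytes 0xFF 0xD8, so the written file can be checked in code before opening it in a viewer. A small sketch (looksLikeJpeg is just an illustrative name):

```javascript
// Sketch: check a buffer for the JPEG start-of-image marker (0xFF 0xD8).
function looksLikeJpeg(buf) {
    return buf.length >= 2 && buf[0] === 0xFF && buf[1] === 0xD8;
}
```

For example, `looksLikeJpeg(require('fs').readFileSync(filename))` returning false would confirm the bad data arrived before the file was even written.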