For my website, I deploy all the assets (fonts/images/js etc.) to an S3 bucket. The index.html (a single-page Ember.js application) is served by an Elastic Beanstalk Node.js server: the Node app.js takes any request to www.domain.com/* and responds with a locally stored index.html. I would like to cut out the step of deploying a new application to Elastic Beanstalk for every production build, and instead deploy all assets and the index.html to the S3 bucket alone.
This is what I have so far:
var AWS = require('aws-sdk'),
    fs = require('fs'),
    http = require('http'),
    static = require('node-static');

/*
 * AWS security credentials
 */
AWS.config.loadFromPath('./config.json');

var port = process.env.PORT || 3000;

/*
 * Create a node-static server instance to serve
 * the downloaded index.html from the local directory
 */
var fileServer = new static.Server('.');

/*
 * Fetch index.html from S3 and cache it locally.
 * Note: this runs once at startup, so the HTTP server
 * below may briefly come up before the download finishes.
 */
var s3 = new AWS.S3();
var params = {Bucket: 'assets', Key: 'index.html'};
var file = fs.createWriteStream('index.html');

s3.getObject(params)
  .on('httpData', function (chunk) { file.write(chunk); })
  .on('httpDone', function () { file.end(); })
  .send();

var server = http.createServer(function (request, response) {
  request.addListener('end', function () {
    fileServer.serveFile('index.html', 200, {}, request, response);
  }).resume();
}).listen(port);
As written, I assume this only fetches index.html from S3 once, when the server first starts. What would be the best practice for caching it, preferably with a 1-minute expiry?
Thanks!
Have a look at Amazon's CloudFront. It sounds like a fit for what you're trying to accomplish, namely that the files wouldn't have to go through your server again, at the cost of adding a little to the round-trip of your full page load.
That said, to cache locally, you could store the entire file in Redis (or another fast key-value store such as Riak or memcached).
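Here's a minimal sketch of that idea with the 1-minute expiry you asked about, assuming the `redis` npm client and the AWS SDK v2 callback API (the bucket and key names are taken from your question):

var AWS = require('aws-sdk'),
    http = require('http'),
    redis = require('redis'),
    client = redis.createClient();

AWS.config.loadFromPath('./config.json');
var s3 = new AWS.S3();

/*
 * Return index.html from Redis if cached, otherwise fetch it
 * from S3 and cache it with a 60-second TTL, so at most one
 * S3 round-trip happens per minute.
 */
function getIndexHtml(callback) {
  client.get('index.html', function (err, cached) {
    if (cached) return callback(null, cached);
    s3.getObject({Bucket: 'assets', Key: 'index.html'}, function (err, data) {
      if (err) return callback(err);
      var body = data.Body.toString('utf-8');
      // SETEX stores the value and lets Redis expire it after 60 seconds
      client.setex('index.html', 60, body, function () {
        callback(null, body);
      });
    });
  });
}

http.createServer(function (request, response) {
  getIndexHtml(function (err, html) {
    if (err) {
      response.writeHead(500);
      return response.end();
    }
    response.writeHead(200, {'Content-Type': 'text/html'});
    response.end(html);
  });
}).listen(process.env.PORT || 3000);

The nice part is that the expiry is handled entirely by Redis, so there's no timer to manage in the Node process.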
I'm not sure how this would hold up if the files were large, but it would still be faster than pulling from S3 on every request.