Event-Driven Processing Architecture Advice

I’m currently working on a project where I need to perform file processing in real time. Here’s the workflow I’m trying to achieve:

a. The user sends a file-processing request to Server A (web)
b. Server A forwards the request to Server B
c. Server B finishes processing and signals Server A that it’s done
d. Server A signals the user that the file is ready (web)

[Diagram: simple event architecture]

What am I looking for? I need a simple event-driven way to communicate between Server A & Server B (i.e., steps b & c). FYI, I’ll be using socket.io to communicate with the user on steps a & d via the web, roughly as in the sketch below.
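For context, the socket.io side (steps a & d) would look roughly like this; the `process-file` and `file-ready` event names are just placeholders for illustration:

```js
var http = require('http');
var server = http.createServer();
var io = require('socket.io')(server);

io.on('connection', function (socket) {
  // Step a: the user asks for a file to be processed.
  socket.on('process-file', function (fileId) {
    // Steps b & c happen here: hand off to a processing server
    // and wait to hear back -- the part this question is about.

    // Step d: tell the user the file is ready.
    socket.emit('file-ready', fileId);
  });
});

server.listen(3000);
```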

The environments I’ll be using are Ubuntu 14.04 servers running node.js services (note: the solution doesn’t have to be strictly Node, as long as it has an interface to Node).

Seems simple enough? Here’s where it gets complicated. Each set of servers (consider them now web servers vs. processing servers) is going to be replicated within a cloud (so many of each). The caveat is that when a processing server has finished processing a file, it must signal all web servers that the file is ready. Why? Each web server may have serviced a request that is waiting for the same file.

[Diagram: cloud event architecture]

I need a solution that achieves this workflow in the fastest time possible (i.e., event-driven rather than polling), interfaces with node.js, and runs on Ubuntu. Any thoughts?

One way to make your life easier might be to use Redis for your communication back to the web servers about when the file is loaded and ready to go. Redis has built-in channel subscription/messaging (pub/sub), and it’s quite easy to use with node.js.
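Here’s a minimal sketch of Redis pub/sub from Node, assuming the classic callback API of the node_redis client (the `file-ready` channel name is just an example):

```js
var redis = require('redis');

// Each web server keeps a dedicated subscriber connection
// (a connection in subscribe mode can't issue other commands).
var sub = redis.createClient();

sub.subscribe('file-ready');
sub.on('message', function (channel, fileId) {
  console.log('file ready:', fileId);
});

// A processing server publishes on the same channel when it finishes;
// every subscribed web server receives the message.
var pub = redis.createClient();
pub.publish('file-ready', 'some-file-id');
```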

So when a file request comes in: if the web server has previously heard from Redis that the file is loaded, it can fetch the file and return it; otherwise it signals the backend to process the file and waits for a notification on a Redis channel.
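Putting it together, the web-server side might look roughly like this. The `ready:<fileId>` key convention and the `process-file` channel are assumptions about how you’d wire up the processing servers, not anything Redis prescribes:

```js
var redis = require('redis');
var db = redis.createClient();   // regular connection for commands
var sub = redis.createClient();  // dedicated connection for subscriptions

// Track sockets waiting on each file so one broadcast resolves them all.
var waiting = {};

sub.subscribe('file-ready');
sub.on('message', function (channel, fileId) {
  (waiting[fileId] || []).forEach(function (socket) {
    socket.emit('file-ready', fileId);
  });
  delete waiting[fileId];
});

// Call this from your socket.io 'process-file' handler.
function handleRequest(socket, fileId) {
  // Assumed convention: a processor SETs ready:<fileId> when done.
  db.exists('ready:' + fileId, function (err, isReady) {
    if (isReady) {
      socket.emit('file-ready', fileId);
    } else {
      (waiting[fileId] = waiting[fileId] || []).push(socket);
      // Assumed channel the processing servers listen on.
      db.publish('process-file', fileId);
    }
  });
}
```

Because every web server subscribes to `file-ready`, a single publish from a processing server reaches all of them, which covers the case where several web servers are each holding requests for the same file.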