Solution/Architecture: queues or something else?

I have multiple frontends to my service written in Node.js and workers written in Ruby. The question is how to make them communicate. I need to maintain a dynamic pool of workers to handle load (spawn more workers when load rises), and the messages are quite big, ~2-3 MB, because I'm sending images that users upload through the Node.js frontends. Because I want the system to scale nicely, I thought about some queueing solution, but I haven't found any existing solution (or maybe I've misunderstood the guides) that provides:

  1. A fallback mechanism. The solutions I've found so far have a single point of failure, the message broker, with no way to provide a fallback.
  2. Persistence, so that tasks are not lost when the broker fails.
  3. The ability to pass big messages.
  4. An easy API for both Ruby and Node.js.
  5. An API to track queue size so I can resize the worker pool.
  6. Preferably something lightweight.

Maybe my approach is wrong and I shouldn't use queues but some other mechanism? Or is there a queueing solution that fits the requirements above?
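To make requirement 5 concrete, here is a minimal sketch of the policy I have in mind: read the queue depth and derive a worker-pool size from it. The `desired_workers` helper and its thresholds are hypothetical, just to illustrate the idea.

```ruby
jobs = Queue.new   # stand-in for the real broker's queue
30.times { |i| jobs << "image-#{i}.jpg" }

# Hypothetical scaling policy: one worker per 10 queued jobs,
# clamped between a minimum and maximum pool size.
def desired_workers(queue_size, per_worker: 10, min: 2, max: 8)
  [[(queue_size.to_f / per_worker).ceil, min].max, max].min
end

pool_size = desired_workers(jobs.size)
puts pool_size  # => 3
```

With a real broker, `jobs.size` would be replaced by whatever queue-depth API the broker exposes; the clamping keeps the pool from thrashing at the extremes.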

No doubt you need a queue to scale, and you can monitor that queue to decide when to spawn workers.

Apache ActiveMQ is very robust and exposes a REST API. A Ruby client is also available for accessing the queue.

There is an interesting article on building a RESTful queue with Apache ActiveMQ.
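Because the queue is reachable over plain HTTP, enqueuing from Ruby needs nothing beyond the standard library. A sketch, assuming ActiveMQ's stock defaults (web console on port 8161, `admin`/`admin` credentials) and a hypothetical queue named `images`:

```ruby
require "net/http"
require "uri"

# ActiveMQ's web console exposes queues at /api/message/<name>?type=queue.
# Port 8161 and admin/admin are the stock defaults; a real deployment
# will differ. The queue name "images" is an assumption for this example.
uri = URI("http://localhost:8161/api/message/images?type=queue")

request = Net::HTTP::Post.new(uri)
request.basic_auth("admin", "admin")
request.body = "job payload (or a reference to the uploaded image)"

# Uncomment once a broker is actually running:
# Net::HTTP.start(uri.host, uri.port) { |http| http.request(request) }

puts request.method           # => POST
puts request.uri.request_uri  # => /api/message/images?type=queue
```

For 2-3 MB images you would likely POST a reference (a path or object-store key) rather than the raw bytes, keeping the queue messages small.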

At the end of the day I went with ZeroMQ. It is a very fast, robust, and lightweight implementation. I had to write my own broker, but that is the only downside of this solution.
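To show what "writing your own broker" amounts to, here is a toy sketch of the broker role using plain TCP from Ruby's standard library (not ZeroMQ itself; a real version would use ZeroMQ sockets, e.g. PUSH/PULL via a gem such as ffi-rzmq, which also handles framing and reconnects for you):

```ruby
require "socket"

# Toy broker: accept one producer connection, buffer each incoming
# line as a job, and let workers pull jobs off the queue in FIFO order.
queue  = Queue.new
server = TCPServer.new("127.0.0.1", 0)  # port 0 => pick any free port
port   = server.addr[1]

broker = Thread.new do
  client = server.accept
  while (line = client.gets)
    queue << line.chomp
  end
  client.close
end

# A "frontend" pushes two jobs over the wire.
producer = TCPSocket.new("127.0.0.1", port)
producer.puts "resize image-1.jpg"
producer.puts "resize image-2.jpg"
producer.close
broker.join

puts queue.size  # => 2
puts queue.pop   # => resize image-1.jpg
```

The real broker additionally needs the durability and fallback behaviour from the question's list, which is exactly the part ZeroMQ leaves to you.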

Redis publish/subscribe should do the trick:

http://redis.io/topics/pubsub