Recently I've been stumbling upon a lot of benchmarks comparing Node.js and Clojure, such as this, and this, and this. It seems to me that, compared to languages like Ruby, Node.js and Clojure are about equally fast (which means a lot faster).
The question is: how does Clojure compare to Node.js in terms of RAM consumption? Say I were about to write a simple live chat app.
If I were comparing Rails to Node.js, I could basically expect Node.js to be 100 times faster and to use a tenth of the memory ... but how does Clojure fit in here?
How would Clojure compare here in terms of memory consumption? Can I expect it to take a lot more memory than Node.js, because it is running on the JVM? Or is this just a stereotype that isn't true anymore?
For a simple application on modern hardware, you should have no memory usage issues with either Node.js or Clojure.
Of course, as Niklas points out, it will ultimately depend on which frameworks you use and how well written your app is.
Clojure has quite a significant base memory requirement (the Java runtime environment / JVM is pretty large), but I've found it to be quite memory efficient beyond that point. Clojure objects are just Java objects under the hood, so that probably shouldn't be too surprising.
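If you want to convince yourself of the "just Java objects" point, a quick REPL session like the following (output shown as comments, values are only indicative) makes it visible:

```clojure
;; Clojure's values are ordinary JVM objects, so the JVM's memory model
;; applies to them directly.
(class "hello")      ; => java.lang.String
(class 42)           ; => java.lang.Long
(class [1 2 3])      ; => clojure.lang.PersistentVector
(supers (class {}))  ; includes java.util.Map, java.lang.Object, ...
```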
It's also worth noting that directly measuring the memory usage of a JVM app is usually misleading, since the JVM typically pre-allocates more memory than it needs and only garbage collects lazily (as needed). So while total memory usage can look high, the actual working set may be quite small, and that is what you really care about for performance purposes.
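As a rough sketch of how you might estimate the working set from the REPL rather than trusting the process-level number, you can hint a GC and then read the used heap via java.lang.Runtime (the helper name `used-heap-mb` is just something I made up here, and `System/gc` is only a hint, so treat the figure as indicative):

```clojure
(defn used-heap-mb
  "Approximate heap currently in use, in MB, after hinting a GC."
  []
  (System/gc)                                    ; ask the JVM to collect garbage first
  (let [rt (Runtime/getRuntime)]
    (/ (- (.totalMemory rt) (.freeMemory rt))    ; bytes currently in use
       1024.0 1024.0)))

(used-heap-mb)  ; => typically a few tens of MB for a bare Clojure REPL
```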