How can I use the getUserMedia() API with Node.js to stream media in real time?

Learning how to capture video and audio in the browser took a simple Google search, which turned up this minimal working example that plays my own video and audio back to me:

navigator.getUserMedia = (navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia || navigator.msGetUserMedia);

if (navigator.getUserMedia) {
  navigator.getUserMedia(

    // constraints
    {
      video: true,
      audio: true
    },

    // successCallback
    function(localMediaStream) {
      var video = document.querySelector('video');
      // Older examples use window.URL.createObjectURL(localMediaStream);
      // modern browsers require assigning the stream to srcObject instead.
      video.srcObject = localMediaStream;
      // Do something with the video here, e.g. video.play()
    },

    // errorCallback
    function(err) {
      console.log("The following error occured: " + err);
    });
} else {
  console.log("getUserMedia not supported");
}

/* CSS */
video {
  width: 100%;
  height: 420px;
  background-color: black;
}

<!-- HTML -->
<video autoplay></video>

But from there, the search criteria become tricky. Searching for "Stream audio & video with Node.js" turns up tutorials on streaming video from the server to the client, but nothing on my actual question: how to stream audio and video from one client, through the server, to another client.

I want to learn the fundamental steps of doing this. I'd like to really understand it in a simple way before considering additional APIs to speed up the process. I'm open to using Socket.io, since it's a common, widely used library.

Hence my question:

How, in the simplest possible form, can I stream video and audio data from one client to another through a Node.js server?

Depending on your needs, the best way may be to use WebRTC, which streams audio and video directly from one client to another, using your Node server only to establish the connection by exchanging some mandatory signaling information.

I can suggest a good tutorial to get started with WebRTC here: http://www.html5rocks.com/en/tutorials/webrtc/basics/