Why Does Video Streaming Continue Although the Node Server Is Closed?

Video streaming starts after the WebRTC clients exchange messages with each other. The message exchange runs through a Node server that uses port 1337 and WebSockets. The video stream continues even after I close the server on port 1337. Why? Which ports are used by the WebRTC clients? The Node server uses one port (1337). How can I learn, control, or change the ports used by the WebRTC server and clients?

The Node.js server is only used for session setup and teardown. So, once a session is started (unless media or networking changes), it will continue until one of the peers ends it. Remember, WebRTC is a peer-to-peer connection.

Once the needed ICE candidates and SDP information have been exchanged for the peer connection, there is no longer a need for the signalling server (barring any unforeseen network/media changes). One of the peers in the session (the RTCPeerConnection itself) would have to end the session and stop streaming.
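As a rough sketch of that flow (the `ws://localhost:1337` URL and the `{ type, ... }` message format are my assumptions about the signalling protocol, not something the question specifies), the signalling socket can even be closed as soon as the connection reaches the `connected` state and the media keeps flowing:

```javascript
// Minimal sketch: the WebSocket URL and the { type, ... } message shape are
// assumptions about your signalling protocol, not part of the WebRTC API.
const signalling = new WebSocket('ws://localhost:1337');
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
});

// Relay our ICE candidates to the other peer through the signalling channel.
pc.onicecandidate = ({ candidate }) => {
  if (candidate) signalling.send(JSON.stringify({ type: 'candidate', candidate }));
};

// Once the peers are connected, signalling has done its job: closing the
// socket (or stopping the Node server on port 1337) does not stop the media.
pc.onconnectionstatechange = () => {
  if (pc.connectionState === 'connected') signalling.close();
};

// Caller side: attach media and send an offer over the signalling channel.
async function call() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  stream.getTracks().forEach(track => pc.addTrack(track, stream));
  await pc.setLocalDescription(await pc.createOffer());
  signalling.send(JSON.stringify(pc.localDescription));
}

// Both sides: handle whatever arrives on the signalling channel.
signalling.onmessage = async ({ data }) => {
  const msg = JSON.parse(data);
  if (msg.type === 'offer') {
    await pc.setRemoteDescription(msg);
    await pc.setLocalDescription(await pc.createAnswer());
    signalling.send(JSON.stringify(pc.localDescription));
  } else if (msg.type === 'answer') {
    await pc.setRemoteDescription(msg);
  } else if (msg.type === 'candidate') {
    await pc.addIceCandidate(msg.candidate);
  }
};
```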

Edit:

  1. For Chrome, by default, the media is muxed on the same port (only one media port is used). In Firefox, I believe audio and video use separate ports, unless that has changed in a recent update. You can check which ports actually end up being used with `getStats()`; see the sketch after this list.
  2. You cannot specify media ports in the WebRTC client and still guarantee a media connection can be made (not in any browser implementation; you may be able to do this with the native API). You could modify the ports in the SDP, but that would break NAT traversal for the media.
  3. The signalling server has nothing to do with what media ports are used by the clients.
  4. You most definitely cannot modify the media stream at all after the signalling server is offline, as there is no way to exchange media setup information (ports, media type, stream ID, etc.).
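If you just want to learn which ports the browsers actually picked, a hedged sketch using the standard `getStats()` API (not something covered in the original answer) looks like this:

```javascript
// Sketch: log the local/remote address and port of the selected candidate pair.
// Assumes `pc` is an already-connected RTCPeerConnection.
async function logSelectedPorts(pc) {
  const stats = await pc.getStats();
  stats.forEach(report => {
    if (report.type === 'candidate-pair' && report.nominated && report.state === 'succeeded') {
      const local = stats.get(report.localCandidateId);
      const remote = stats.get(report.remoteCandidateId);
      // Older browser versions expose `ip` instead of `address`.
      console.log(`local  ${local.address ?? local.ip}:${local.port}`);
      console.log(`remote ${remote.address ?? remote.ip}:${remote.port}`);
    }
  });
}
```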

I foresee two ways of controlling the media ports with the current JavaScript APIs:

  1. You would have to control the ICE server(s) and what ports they are allowed to try to set up for NAT traversal (STUN and TURN). This is done server side and cannot be done by the WebRTC (client-side) part of the system. So you cannot do it in WebRTC alone, only by manipulating how the ICE servers gather candidates.
  2. Another, very unlikely and almost unmentionable case is if you knew for a fact which ports are available for NAT traversal (you set up port forwarding or something similar on both peer ends); then you could modify the media port in the SDP before setting it locally and sending it to the peer (see the sketch after this list). You would not need ICE servers in that case.
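A minimal sketch of that second, hand-configured case (the port 50000 and the `forceMediaPort` helper are purely hypothetical; munging the SDP like this is unsupported and will normally break connectivity unless the ports really are reachable on both ends):

```javascript
// Hypothetical helper: rewrite the port in every m= line of the SDP.
function forceMediaPort(sdp, port) {
  return sdp.replace(/^m=(\w+) \d+ /gm, `m=$1 ${port} `);
}

// Create an offer with no ICE servers and force the media port before
// setting it locally and sending it to the peer over your own channel.
async function createMungedOffer(port = 50000) {
  const pc = new RTCPeerConnection({ iceServers: [] });
  pc.addTransceiver('video'); // ensure the offer contains an m=video section
  const offer = await pc.createOffer();
  const munged = { type: offer.type, sdp: forceMediaPort(offer.sdp, port) };
  await pc.setLocalDescription(munged);
  return munged; // send this to the remote peer instead of the original offer
}
```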

Here is a deeper discussion about port allocation for the media