First of all, understand the following components involved in every WebRTC application:
- Signalling servers (mandatory)
- ICE servers (mandatory)
- Media servers (optional)
Signalling servers
E.g. Socket.io, WebSockets, XMPP, SIP, or, simplest of all, plain XHR.
A gateway or web service capable of exchanging some kind of data (e.g. text messages) between two or more users.
You can use realtime protocols like WebSockets to exchange data as quickly as possible, or a simple POST/GET mechanism to POST data to the server and share it with the relevant users.
Signalling is a mandatory part of WebRTC, because WebRTC uses the Offer/Answer model to set up a peer connection: one peer's offer must be shared with the other, and vice versa.
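The exchange itself can ride on any transport. As a minimal sketch (the envelope shape and field names here are hypothetical, not part of any standard), a signalling message can be a small JSON wrapper that either side serializes over WebSockets or XHR:

```javascript
// Hypothetical signalling envelope: wrap offers/answers/candidates in
// JSON and relay them through any transport (WebSocket, XHR polling, etc.).

function makeSignal(type, from, payload) {
  // type is "offer", "answer" or "candidate"
  return JSON.stringify({ type, from, payload });
}

function handleSignal(raw, handlers) {
  // Dispatch an incoming raw message to the matching handler.
  const msg = JSON.parse(raw);
  const handler = handlers[msg.type];
  if (handler) handler(msg.from, msg.payload);
}
```

The server side only has to route these blobs between the relevant users; it never needs to understand the SDP inside them.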
ICE servers
E.g. STUN or TURN servers.
WebRTC depends on ICE servers because peers usually sit behind NATs and firewalls that block direct UDP or TCP connections, so working routes must be discovered.
A simple ICE server (e.g. STUN) reports the public IP address and port of the NAT as seen from outside. The browser then runs connectivity checks over these candidates to find a working UDP or TCP path.
To traverse restrictive NATs (e.g. symmetric NATs, which map each destination to a different public port, so STUN-discovered candidates fail), we need to fall back to a relaying ICE server, i.e. TURN. The TURN server allocates a relay address on itself and forwards the media stream between the peers.
Google provides a public STUN server, so you don't need to set up your own ICE server for testing purposes. However, it is recommended to run custom STUN/TURN servers for commercial use.
You can also run a STUN server on Node.js; it is easy to install and use. However, always keep symmetric NATs in mind: STUN alone cannot traverse them, so you'll need TURN as well.
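In browser code, ICE servers are just a configuration object passed to the peer connection. A sketch, using Google's well-known public STUN server plus a placeholder TURN entry (the TURN URL and credentials below are invented for illustration):

```javascript
// ICE configuration sketch: public STUN for connectivity checks, plus a
// TURN relay as fallback for symmetric NATs.
const iceConfiguration = {
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },   // Google's public STUN server
    {
      urls: 'turn:turn.example.com:3478',       // placeholder TURN server
      username: 'webrtc-user',                  // placeholder credentials
      credential: 'secret'
    }
  ]
};
// In the browser you would pass it to the constructor:
// const pc = new RTCPeerConnection(iceConfiguration);
```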
Media servers
E.g. Asterisk or FreeSWITCH, or third-party servers like Kamailio, BigBlueButton, etc.
A media server is a service capable of capturing and processing RTP packets, e.g. transcoding, recording, or mixing them on the server side.
A media server can handle far more bandwidth than a desktop client.
Media servers ship with native stream-processing libraries, similar to FFmpeg.
You can use them to relay a stream to thousands of peers.
Media servers are useful in commercial apps, but they are not mandatory. You can interconnect five users with the peer-to-peer model without worrying about media servers.
Browser APIs
E.g. RTCPeerConnection, RTCDataChannel (SCTP data channels), screen sharing, desktop sharing, etc.
To use them, you just need to understand the offer/answer model.
WebRTC Offer/Answer model
Offer/Answer is a well-known signalling model that has been used on the realtime web for the last two decades.
One peer creates an offer, and the other creates an answer.
Offers and answers are exchanged over signalling gateways such as WebSockets, SIP, XMPP, etc.
In WebRTC terms, an offer or answer is a session description of the user, which contains information like audio/video codecs, resolutions, bandwidth, cryptographic keys, media devices, etc.
If user A creates an offer, his session description is known as the "local session description". When we share his offer with the other user, the first user's "local session description" becomes the "remote session description" for the second user.
So the offer/answer model is the process of exchanging local and remote session descriptions.
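In browser code the exchange looks roughly like this. The `sendToPeer()` callback below is a placeholder for your signalling channel, not a real API; `pc` is an RTCPeerConnection:

```javascript
// Offer/answer sketch. sendToPeer() stands in for whatever signalling
// transport you use (WebSocket, XHR, etc.).

async function startCall(pc, sendToPeer) {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);          // caller's local description
  sendToPeer({ type: 'offer', sdp: offer });    // becomes callee's remote description
}

async function answerCall(pc, offer, sendToPeer) {
  await pc.setRemoteDescription(offer);         // caller's offer = callee's remote description
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);         // callee's local description
  sendToPeer({ type: 'answer', sdp: answer });  // becomes caller's remote description
}
```

Once the caller applies the returned answer via `setRemoteDescription()`, both peers hold matching local/remote pairs and the browser can begin connecting.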
You also need to exchange each peer's local ICE candidates with the other, and vice versa.
WebRTC ICE Candidates
Browsers provide an API to configure the ICE servers (their IPs/ports) that are used to gather ICE candidates for the user.
WebRTC developers can't gather ICE candidates manually; the browser does that itself. However, you can run your own ICE server to control which ports and "public" IP addresses are handed out.
Only the signalling part is left to the application developer. Media establishment and ICE connectivity checks are performed by the browser itself.
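The application's only job is to wire the browser's candidate events to the signalling channel. A sketch (again, `sendToPeer()` is a placeholder signalling call, not a real API):

```javascript
// Trickle-ICE sketch: the browser gathers candidates on its own and
// fires onicecandidate; the application just relays them to the other
// peer over the signalling channel.

function relayLocalCandidates(pc, sendToPeer) {
  pc.onicecandidate = (event) => {
    // A null candidate signals the end of gathering.
    if (event.candidate) {
      sendToPeer({ type: 'candidate', candidate: event.candidate });
    }
  };
}

function addRemoteCandidate(pc, candidate) {
  // Candidates received from the other peer are handed back to the browser.
  return pc.addIceCandidate(candidate);
}
```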
Want to learn more about WebRTC?
You can find many tutorials here: https://www.webrtc-experiment.com/docs/
If you want to write your first WebRTC application, start with the tutorials linked above. There are also documents like WebRTC for Beginners and WebRTC for Newbies.
A few things about WebRTC
WebRTC is a realtime communication engine that primarily works in the context of the browser, though native APIs exist as well.
It is not just a collection of APIs; it is an innovative framework capable of defining new protocols, suggesting new gateways, etc. Remember, WebRTC isn't a service in itself! Services can use the WebRTC API to compete in the market!
WebRTC is not a sub-part of browsers. Browsers implement the WebRTC native API to take advantage of its media engines and deliver media/data streaming using WebRTC-standard protocols.
It is not Google's WebRTC! It is yours!!!
Think of WebRTC as a media streaming engine primarily developed for web developers. It covers:
- Media Communication (RTCPeerConnection API)
- Data Communication (RTCDataChannel/SCTP-datachannel API)
- Media Capturing (getUserMedia API)
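The capture side can be sketched as follows. `navigator.mediaDevices.getUserMedia` is the standard browser API; the optional second parameter below is my own addition so the media source can be injected explicitly (e.g. for testing):

```javascript
// Capture sketch: request camera + microphone, then attach each track
// to the peer connection. navigator.mediaDevices exists only in the
// browser; the parameter default lets you pass it (or a stand-in) in.

async function captureAndAttach(pc, mediaDevices = navigator.mediaDevices) {
  const stream = await mediaDevices.getUserMedia({ audio: true, video: true });
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);   // standard way to feed media into RTCPeerConnection
  }
  return stream;
}
```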
There is also a separate draft/specification for the Media Processing API.
The WebAudio API has its own specification as well and can be used in WebRTC applications to process or initialize media streams.