How to Develop Your First WebRTC Application – Part 2


WebRTC is the next big thing in the field of internet-based communication. Bridging the gap between standalone video-calling software and purely browser-based web apps, WebRTC is a flexible API with amazing possibilities.

In my previous post I talked about the underlying theory behind any WebRTC app, and highlighted the roles of the components of the WebRTC API: MediaStream, RTCPeerConnection, and RTCDataChannel. In this post we will look into the actual coding that goes into a WebRTC app. As an example, we are going to develop a simple video-calling web app using WebRTC, Node.js, and Socket.IO.

I am assuming that you have sound knowledge of JavaScript, HTML, CSS, and Node.js. If not, this tutorial may seem a little advanced to you.

Let the Coding BEGIN!

OK, enough with the background and theory. Let's get started with the coding.

Step 1: Create a blank HTML5 document

  • Create a bare-bones HTML document.
<script type="text/javascript" src="js/lib/adapter.js"></script>
<script type="text/javascript">
// your code will go here
</script>

Step 2: Get video from your webcam

  • Add a video element to your page.
  • Add the following JavaScript to the script element on your page, to enable getUserMedia() to set the source of the video from the web cam:
var constraints = {video: true};

function successCallback(localMediaStream) {
window.stream = localMediaStream; // stream available to console
var video = document.querySelector("video");
video.src = window.URL.createObjectURL(localMediaStream);
video.play();
}

function errorCallback(error){
console.log("navigator.getUserMedia error: ", error);
}

navigator.getUserMedia(constraints, successCallback, errorCallback);
  • Test it out locally

So what exactly did we do here? We first called getUserMedia using:

navigator.getUserMedia(constraints, successCallback, errorCallback);

The constraints argument allows us to specify the media to get, in this case video only:

var constraints = {video: true};

If successful, the video stream from the webcam is set as the source of the video element:

function successCallback(localMediaStream) {
window.stream = localMediaStream; // stream available to console
var video = document.querySelector("video");
video.src = window.URL.createObjectURL(localMediaStream);
video.play();
}

Step 3: Set up a signaling server and exchange messages

In a real world application, the sender and receiver RTCPeerConnections are not on the same page, and we need a way for them to communicate metadata.
For this, we use a signaling server: a server that can exchange messages between a WebRTC app (client) running in one browser and a client in another browser. The actual messages are stringified JavaScript objects.
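To make that concrete, here is a self-contained sketch of one such message (the field names match the candidate messages sent by main.js later in this tutorial; the candidate string itself is a made-up example):

```javascript
// An ICE-candidate signaling message: a plain object on either end,
// stringified JSON while it crosses the signaling server.
var candidateMessage = {
  type: 'candidate',
  label: 0,    // sdpMLineIndex of the candidate
  id: 'audio', // sdpMid of the candidate
  candidate: 'candidate:1 1 UDP 2130706431 192.0.2.1 54321 typ host'
};

// What actually travels over the socket:
var wireFormat = JSON.stringify(candidateMessage);

// ...and what the receiving peer reconstructs from it:
var received = JSON.parse(wireFormat);
console.log(received.type); // 'candidate'
```

In practice socket.io performs this (de)serialization for you; the sketch just shows the shape of the data that moves between the two browsers.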

In this step we’ll build a simple Node.js signaling server, using the socket.io Node module and JavaScript library for messaging. Experience with Node.js and socket.io will be useful, but not crucial: the messaging components are very simple. In this example, the server (the Node app) is server.js and the client (the web app) is index.html.
The Node server application in this step has two tasks.

    • To act as a messaging intermediary:
socket.on('message', function (message) {
log('Got message: ', message);
socket.broadcast.emit('message', message);
});
    • and to manage WebRTC video chat ‘rooms’:
if (numClients === 0){
socket.join(room);
socket.emit('created', room);
} else if (numClients === 1) {
io.sockets.in(room).emit('join', room);
socket.join(room);
socket.emit('joined', room);
} else { // max two clients
socket.emit('full', room);
}

Our simple WebRTC application will only permit a maximum of two peers to share a room. Ensure you have Node, socket.io and node-static installed. To install socket.io and node-static, run Node Package Manager from a terminal in your application directory:

npm install socket.io
npm install node-static
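The room-management logic above boils down to a three-way decision. As a self-contained sketch (roomEventFor is our name, not part of the tutorial code), the event the server emits back depends only on how many clients already occupy the room:

```javascript
// Return the event the server emits to a socket asking to
// create or join a room that already holds numClients peers.
function roomEventFor(numClients) {
  if (numClients === 0) {
    return 'created'; // first peer: room is created, this peer is the initiator
  } else if (numClients === 1) {
    return 'joined'; // second peer joins the existing room
  }
  return 'full'; // a third peer is turned away
}

console.log(roomEventFor(0)); // 'created'
console.log(roomEventFor(1)); // 'joined'
console.log(roomEventFor(2)); // 'full'
```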

You should now have three separate files: index.html, server.js, and your main JavaScript file, main.js. They would look something like this.

    • server.js
var static = require('node-static');
var http = require('http');
var file = new(static.Server)();
var app = http.createServer(function (req, res) {
file.serve(req, res);
}).listen(2013);

var io = require('socket.io').listen(app);

io.sockets.on('connection', function (socket){

// convenience function to log server messages on the client
function log(){
var array = [">>> Message from server: "];
for (var i = 0; i < arguments.length; i++) {
array.push(arguments[i]);
}
socket.emit('log', array);
}

socket.on('message', function (message) {
log('Got message:', message);
// for a real app, would be room only (not broadcast)
socket.broadcast.emit('message', message);
});

socket.on('create or join', function (room) {
var numClients = io.sockets.clients(room).length;

log('Room ' + room + ' has ' + numClients + ' client(s)');
log('Request to create or join room ' + room);

if (numClients === 0){
socket.join(room);
socket.emit('created', room);
} else if (numClients === 1) {
io.sockets.in(room).emit('join', room);
socket.join(room);
socket.emit('joined', room);
} else { // max two clients
socket.emit('full', room);
}
log('Client ' + socket.id + ' joined room ' + room);

});

});

    • index.html
<!DOCTYPE html>
<html>
<head>
<title>WebRTC client</title>
</head>
<body>
<script type="text/javascript" src="http://localhost:2013/socket.io/socket.io.js"></script>
<script type="text/javascript" src="js/lib/adapter.js"></script>
<script type="text/javascript" src="js/main.js"></script>
</body>
</html>

    • main.js
var isInitiator;

var room = prompt("Enter room name:");

var socket = io.connect();

if (room !== "") {
console.log('Joining room ' + room);
socket.emit('create or join', room);
}

socket.on('full', function (room){
console.log('Room ' + room + ' is full');
});

socket.on('empty', function (room){
isInitiator = true;
console.log('Room ' + room + ' is empty');
});

socket.on('join', function (room){
console.log('Another peer made a request to join room ' + room);
console.log('You are the initiator!');
});

socket.on('log', function (array){
console.log.apply(console, array);
});

To start the server, run the following command from a terminal in your application directory:

node server.js

Step 4: Connecting everything

Now we are going to connect everything: we will add signaling to the video client created in Step 2, and implement RTCPeerConnection and RTCDataChannel in our app.

    • Add the following code to your main HTML file (main.js queries for the localVideo and remoteVideo elements):
<div id="container">
<div id="videos">
<video id="localVideo" autoplay muted></video>
<video id="remoteVideo" autoplay></video>
</div>
<div id="textareas">
<textarea id="dataChannelSend" disabled="disabled"></textarea>
<textarea id="dataChannelReceive" disabled="disabled"></textarea>
</div>
<button id="sendButton" disabled="disabled">Send</button>
</div>
    • Your main JavaScript file, main.js, would look something like this
'use strict';

var sendChannel;
var sendButton = document.getElementById("sendButton");
var sendTextarea = document.getElementById("dataChannelSend");
var receiveTextarea = document.getElementById("dataChannelReceive");

sendButton.onclick = sendData;

var isChannelReady;
var isInitiator;
var isStarted;
var localStream;
var pc;
var remoteStream;
var turnReady;

var pc_config = webrtcDetectedBrowser === 'firefox' ?
{'iceServers':[{'url':'stun:23.21.150.121'}]} : // number IP
{'iceServers': [{'url': 'stun:stun.l.google.com:19302'}]};

var pc_constraints = {
'optional': [
{'DtlsSrtpKeyAgreement': true},
{'RtpDataChannels': true}
]};

// Set up audio and video regardless of what devices are present.
var sdpConstraints = {'mandatory': {
'OfferToReceiveAudio':true,
'OfferToReceiveVideo':true }};

/////////////////////////////////////////////

var room = location.pathname.substring(1);
if (room === '') {
// room = prompt('Enter room name:');
room = 'foo';
}

var socket = io.connect();

if (room !== '') {
console.log('Create or join room', room);
socket.emit('create or join', room);
}

socket.on('created', function (room){
console.log('Created room ' + room);
isInitiator = true;
});

socket.on('full', function (room){
console.log('Room ' + room + ' is full');
});

socket.on('join', function (room){
console.log('Another peer made a request to join room ' + room);
console.log('This peer is the initiator of room ' + room + '!');
isChannelReady = true;
});

socket.on('joined', function (room){
console.log('This peer has joined room ' + room);
isChannelReady = true;
});

socket.on('log', function (array){
console.log.apply(console, array);
});

////////////////////////////////////////////////

function sendMessage(message){
console.log('Sending message: ', message);
socket.emit('message', message);
}

socket.on('message', function (message){
console.log('Received message:', message);
if (message === 'got user media') {
maybeStart();
} else if (message.type === 'offer') {
if (!isInitiator && !isStarted) {
maybeStart();
}
pc.setRemoteDescription(new RTCSessionDescription(message));
doAnswer();
} else if (message.type === 'answer' && isStarted) {
pc.setRemoteDescription(new RTCSessionDescription(message));
} else if (message.type === 'candidate' && isStarted) {
var candidate = new RTCIceCandidate({sdpMLineIndex:message.label,
candidate:message.candidate});
pc.addIceCandidate(candidate);
} else if (message === 'bye' && isStarted) {
handleRemoteHangup();
}
});

////////////////////////////////////////////////////

var localVideo = document.querySelector('#localVideo');
var remoteVideo = document.querySelector('#remoteVideo');

function handleUserMedia(stream) {
localStream = stream;
attachMediaStream(localVideo, stream);
console.log('Adding local stream.');
sendMessage('got user media');
if (isInitiator) {
maybeStart();
}
}

function handleUserMediaError(error){
console.log('getUserMedia error: ', error);
}

var constraints = {video: true};

getUserMedia(constraints, handleUserMedia, handleUserMediaError);
console.log('Getting user media with constraints', constraints);

if (location.hostname !== "localhost") {
requestTurn('https://computeengineondemand.appspot.com/turn?username=41784574&key=4080218913');
}

function maybeStart() {
if (!isStarted && localStream && isChannelReady) {
createPeerConnection();
pc.addStream(localStream);
isStarted = true;
if (isInitiator) {
doCall();
}
}
}

window.onbeforeunload = function(e){
sendMessage('bye');
}

/////////////////////////////////////////////////////////

function createPeerConnection() {
try {
pc = new RTCPeerConnection(pc_config, pc_constraints);
pc.onicecandidate = handleIceCandidate;
console.log('Created RTCPeerConnection with:\n' +
' config: \'' + JSON.stringify(pc_config) + '\';\n' +
' constraints: \'' + JSON.stringify(pc_constraints) + '\'.');
} catch (e) {
console.log('Failed to create PeerConnection, exception: ' + e.message);
alert('Cannot create RTCPeerConnection object.');
return;
}
pc.onaddstream = handleRemoteStreamAdded;
pc.onremovestream = handleRemoteStreamRemoved;

if (isInitiator) {
try {
// Reliable Data Channels not yet supported in Chrome
sendChannel = pc.createDataChannel("sendDataChannel",
{reliable: false});
sendChannel.onmessage = handleMessage;
trace('Created send data channel');
} catch (e) {
alert('Failed to create data channel. ' +
'You need Chrome M25 or later with RtpDataChannel enabled');
trace('createDataChannel() failed with exception: ' + e.message);
}
sendChannel.onopen = handleSendChannelStateChange;
sendChannel.onclose = handleSendChannelStateChange;
} else {
pc.ondatachannel = gotReceiveChannel;
}
}

function sendData() {
var data = sendTextarea.value;
sendChannel.send(data);
trace('Sent data: ' + data);
}

function gotReceiveChannel(event) {
trace('Receive Channel Callback');
sendChannel = event.channel;
sendChannel.onmessage = handleMessage;
sendChannel.onopen = handleReceiveChannelStateChange;
sendChannel.onclose = handleReceiveChannelStateChange;
}

function handleMessage(event) {
trace('Received message: ' + event.data);
receiveTextarea.value = event.data;
}

function handleSendChannelStateChange() {
var readyState = sendChannel.readyState;
trace('Send channel state is: ' + readyState);
enableMessageInterface(readyState == "open");
}

function handleReceiveChannelStateChange() {
var readyState = sendChannel.readyState;
trace('Receive channel state is: ' + readyState);
enableMessageInterface(readyState == "open");
}

function enableMessageInterface(shouldEnable) {
if (shouldEnable) {
dataChannelSend.disabled = false;
dataChannelSend.focus();
dataChannelSend.placeholder = "";
sendButton.disabled = false;
} else {
dataChannelSend.disabled = true;
sendButton.disabled = true;
}
}

function handleIceCandidate(event) {
console.log('handleIceCandidate event: ', event);
if (event.candidate) {
sendMessage({
type: 'candidate',
label: event.candidate.sdpMLineIndex,
id: event.candidate.sdpMid,
candidate: event.candidate.candidate});
} else {
console.log('End of candidates.');
}
}

function doCall() {
var constraints = {'optional': [], 'mandatory': {'MozDontOfferDataChannel': true}};
// temporary measure to remove Moz* constraints in Chrome
if (webrtcDetectedBrowser === 'chrome') {
for (var prop in constraints.mandatory) {
if (prop.indexOf('Moz') !== -1) {
delete constraints.mandatory[prop];
}
}
}
constraints = mergeConstraints(constraints, sdpConstraints);
console.log('Sending offer to peer, with constraints:\n' +
' \'' + JSON.stringify(constraints) + '\'.');
pc.createOffer(setLocalAndSendMessage, null, constraints);
}

function doAnswer() {
console.log('Sending answer to peer.');
pc.createAnswer(setLocalAndSendMessage, null, sdpConstraints);
}

function mergeConstraints(cons1, cons2) {
var merged = cons1;
for (var name in cons2.mandatory) {
merged.mandatory[name] = cons2.mandatory[name];
}
// concat() returns a new array, so assign the result back.
merged.optional = merged.optional.concat(cons2.optional);
return merged;
}

function setLocalAndSendMessage(sessionDescription) {
// Set Opus as the preferred codec in SDP if Opus is present.
sessionDescription.sdp = preferOpus(sessionDescription.sdp);
pc.setLocalDescription(sessionDescription);
sendMessage(sessionDescription);
}

function requestTurn(turn_url) {
var turnExists = false;
for (var i in pc_config.iceServers) {
if (pc_config.iceServers[i].url.substr(0, 5) === 'turn:') {
turnExists = true;
turnReady = true;
break;
}
}
if (!turnExists) {
console.log('Getting TURN server from ', turn_url);
// No TURN server. Get one from computeengineondemand.appspot.com:
var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function(){
if (xhr.readyState === 4 && xhr.status === 200) {
var turnServer = JSON.parse(xhr.responseText);
console.log('Got TURN server: ', turnServer);
pc_config.iceServers.push({
'url': 'turn:' + turnServer.username + '@' + turnServer.turn,
'credential': turnServer.password
});
turnReady = true;
}
};
xhr.open('GET', turn_url, true);
xhr.send();
}
}

function handleRemoteStreamAdded(event) {
console.log('Remote stream added.');
// reattachMediaStream(miniVideo, localVideo);
attachMediaStream(remoteVideo, event.stream);
remoteStream = event.stream;
// waitForRemoteVideo();
}
function handleRemoteStreamRemoved(event) {
console.log('Remote stream removed. Event: ', event);
}

function hangup() {
console.log('Hanging up.');
stop();
sendMessage('bye');
}

function handleRemoteHangup() {
console.log('Session terminated.');
stop();
isInitiator = false;
}

function stop() {
isStarted = false;
// isAudioMuted = false;
// isVideoMuted = false;
pc.close();
pc = null;
}

///////////////////////////////////////////

// Set Opus as the default audio codec if it's present.
function preferOpus(sdp) {
var sdpLines = sdp.split('\r\n');
var mLineIndex = null;
// Search for m line.
for (var i = 0; i < sdpLines.length; i++) {
if (sdpLines[i].search('m=audio') !== -1) {
mLineIndex = i;
break;
}
}
if (mLineIndex === null) {
return sdp;
}

// If Opus is available, set it as the default in m line.
for (i = 0; i < sdpLines.length; i++) {
if (sdpLines[i].search('opus/48000') !== -1) {
var opusPayload = extractSdp(sdpLines[i], /:(\d+) opus\/48000/i);
if (opusPayload) {
sdpLines[mLineIndex] = setDefaultCodec(sdpLines[mLineIndex], opusPayload);
}
break;
}
}

// Remove CN in m line and sdp.
sdpLines = removeCN(sdpLines, mLineIndex);

sdp = sdpLines.join('\r\n');
return sdp;
}

function extractSdp(sdpLine, pattern) {
var result = sdpLine.match(pattern);
return result && result.length === 2 ? result[1] : null;
}

// Set the selected codec to the first in m line.
function setDefaultCodec(mLine, payload) {
var elements = mLine.split(' ');
var newLine = [];
var index = 0;
for (var i = 0; i < elements.length; i++) {
if (index === 3) { // Format of media starts from the fourth.
newLine[index++] = payload; // Put target payload to the first.
}
if (elements[i] !== payload) {
newLine[index++] = elements[i];
}
}
return newLine.join(' ');
}

// Strip CN from sdp before CN constraints is ready.
function removeCN(sdpLines, mLineIndex) {
var mLineElements = sdpLines[mLineIndex].split(' ');
// Scan from end for the convenience of removing an item.
for (var i = sdpLines.length - 1; i >= 0; i--) {
var payload = extractSdp(sdpLines[i], /a=rtpmap:(\d+) CN\/\d+/i);
if (payload) {
var cnPos = mLineElements.indexOf(payload);
if (cnPos !== -1) {
// Remove CN payload from m line.
mLineElements.splice(cnPos, 1);
}
// Remove CN line in sdp.
sdpLines.splice(i, 1);
}
}

sdpLines[mLineIndex] = mLineElements.join(' ');
return sdpLines;
}
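Stepping back, the socket.on('message') handler near the top of main.js is effectively a dispatcher keyed on the message's type. Factoring that decision out as a pure function (our naming, for illustration only) makes the state machine easy to see and test:

```javascript
// Decide what a received signaling message triggers.
// isStarted mirrors the flag used in main.js.
function actionFor(message, isStarted) {
  if (message === 'got user media') return 'maybeStart';
  if (message.type === 'offer') return 'answer';
  if (message.type === 'answer' && isStarted) return 'setRemote';
  if (message.type === 'candidate' && isStarted) return 'addCandidate';
  if (message === 'bye' && isStarted) return 'hangup';
  return 'ignore';
}

console.log(actionFor('got user media', false)); // 'maybeStart'
console.log(actionFor({type: 'offer'}, false));  // 'answer'
console.log(actionFor('bye', false));            // 'ignore' (call not started yet)
```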
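One subtlety in mergeConstraints is worth calling out: Array.prototype.concat returns a new array and leaves the original untouched, so its result must be assigned back or the optional constraints are silently dropped. A quick self-contained sketch:

```javascript
// Merging two constraint objects, as mergeConstraints does.
var merged = {optional: [{DtlsSrtpKeyAgreement: true}], mandatory: {}};
var extra = {optional: [{RtpDataChannels: true}], mandatory: {OfferToReceiveAudio: true}};

// Copy mandatory keys across.
for (var name in extra.mandatory) {
  merged.mandatory[name] = extra.mandatory[name];
}

// concat() does not mutate: without this assignment the
// extra optional constraints would be lost.
merged.optional = merged.optional.concat(extra.optional);

console.log(merged.optional.length); // 2
console.log(merged.mandatory.OfferToReceiveAudio); // true
```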
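To see what the SDP munging in setDefaultCodec accomplishes, here is a self-contained sketch with a toy m line (promotePayload is our simplified stand-in for illustration; the real function above does the same job while preserving the original scan order). Browsers prefer the first codec listed in the m line, which is why Opus gets moved to the front:

```javascript
// A toy m=audio line: payload types 0 (PCMU) and 111 (Opus), PCMU first.
var mLine = 'm=audio 49170 RTP/SAVPF 0 111';

// Move a payload type to the front of the format list.
function promotePayload(mLine, payload) {
  var parts = mLine.split(' ');
  var header = parts.slice(0, 3); // 'm=audio', port, profile
  var formats = parts.slice(3).filter(function (p) { return p !== payload; });
  return header.concat([payload]).concat(formats).join(' ');
}

console.log(promotePayload(mLine, '111'));
// 'm=audio 49170 RTP/SAVPF 111 0'
```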
  • Done!!

Your first WebRTC application is complete

Alright, you are now the proud owner of a brand new WebRTC video-chat app. It may seem a little rudimentary, but you now have the basic essence of the app. You can start experimenting by adding your own CSS to make the page more appealing. In addition, the app supports only one-on-one video chatting, so you can take it a step further and add conference chat options, or, if you fancy it, create a mobile app out of it, like we did.

Rachit Agarwal
Director and Co-Founder at Algoworks Technologies
Rachit leads the mobility business development, mobility strategy, and consulting practice at Algoworks. He is an expert in mobile technologies and has experience managing teams involved in the development of custom iPhone/iPad/Android apps.
  • Shirish

    Awesome article! I couldn’t find a link to Part 1, so a note to the author: it wouldn’t hurt to put a link to Part 1 at the start of Part 2. :)

  • muralidharreddy challa

    Is there a way of using this setup for one-to-many instead of one-to-one?

  • Omar Shrbaji

    I didn’t read the whole code, but I tried to make a one-to-one video call and it worked for me. The problem is that when I tried to make a conference video call, I couldn’t do it. Can anyone help me?