WebRTC Signalling and SDKs

WebRTC Publish & Play JavaScript SDK

Ant Media Server provides a WebSocket interface for publishing and playing WebRTC streams. In this document, we show how to publish and play WebRTC streams using the JavaScript SDK.

How to Publish WebRTC stream with JavaScript SDK

Let’s go through it step by step.

  1. Load the scripts below in the head element of the HTML file
<head>
...
<script src="https://webrtc.github.io/adapter/adapter-latest.js"></script>
<script src="js/webrtc_adaptor.js" ></script>
...
</head>
  2. Create a local video element somewhere in the body tag
<video id="localVideo" autoplay muted width="480"></video>
  3. Initialize the WebRTCAdaptor object in a script tag
var pc_config = null;

var sdpConstraints = {
    OfferToReceiveAudio : false,
    OfferToReceiveVideo : false

};
var mediaConstraints = {
    video : true,
    audio : true
};

var webRTCAdaptor = new WebRTCAdaptor({
    websocket_url : "ws://" + location.hostname + ":8081/WebRTCAppEE",
    mediaConstraints : mediaConstraints,
    peerconnection_config : pc_config,
    sdp_constraints : sdpConstraints,
    localVideoId : "localVideo",
    callback : function(info) {
        if (info == "initialized") {
            console.log("initialized");
        }
        else if (info == "publish_started") {
            //stream is being published
            console.log("publish started");
        }
        else if (info == "publish_finished") {
            //stream is finished
            console.log("publish finished");
        }
        else if (info == "screen_share_extension_available") {
            //screen share extension is available
            console.log("screen share extension available");
        }
        else if (info == "screen_share_stopped") {
            //"Stop Sharing" is clicked in chrome screen share dialog
            console.log("screen share stopped");
        }
    },
    callbackError : function(error) {
        //some of the possible errors, NotFoundError, SecurityError,PermissionDeniedError

        console.log("error callback: " + error);
        alert(error);
    }
});
  4. Call publish(streamName) to Start Publishing

In order to publish a WebRTC stream to Ant Media Server, WebRTCAdaptor’s publish(streamName) function should be called. You can choose to call this function in the success callback function when the info parameter has the value “initialized”

if (info == "initialized")
{
 // it is called with this parameter when it connects to
 // Ant Media Server and everything is ok
 console.log("initialized");
 webRTCAdaptor.publish("stream1");
}
  5. Call stop() to Stop Publishing

You can stop publishing at any time by calling the stop function of WebRTCAdaptor

webRTCAdaptor.stop()

Sample

Please take a look at the WebRTCAppEE/index.html file to see how the JavaScript SDK can be used for publishing a stream.

How to Play WebRTC stream with JavaScript SDK

  1. Load the scripts below in the head element of the HTML file
<head>
...
<script src="https://webrtc.github.io/adapter/adapter-latest.js"></script>
<script src="js/webrtc_adaptor.js" ></script>
...
</head>
  2. Create a remote video element somewhere in the body tag
<video id="remoteVideo" autoplay controls></video>
  3. Initialize the WebRTCAdaptor object in a script tag as below
var pc_config = null;

  var sdpConstraints = {
      OfferToReceiveAudio : true,
      OfferToReceiveVideo : true

  };
  var mediaConstraints = {
      video : true,
      audio : true
  };

  var webRTCAdaptor = new WebRTCAdaptor({
      websocket_url : "ws://" + location.hostname + ":8081/WebRTCAppEE",
      mediaConstraints : mediaConstraints,
      peerconnection_config : pc_config,
      sdp_constraints : sdpConstraints,
      remoteVideoId : "remoteVideo",
      isPlayMode: true,
      callback : function(info) {
          if (info == "initialized") {
              console.log("initialized");

          } else if (info == "play_started") {
              //play_started
              console.log("play started");

          } else if (info == "play_finished") {
              // playing the stream has finished
              console.log("play finished");

          }
      },
      callbackError : function(error) {
          //some of the possible errors, NotFoundError, SecurityError,PermissionDeniedError

          console.log("error callback: " + error);
          alert(error);
      }
  });
  4. Call play(streamName) to Start Playing

In order to play a WebRTC stream from Ant Media Server, WebRTCAdaptor’s play(streamName) function should be called.

You can choose to call this function in the success callback function when the info parameter has the value “initialized”

if (info == "initialized")
{
 // it is called with this parameter when it connects to
 // Ant Media Server and everything is ok
 console.log("initialized");
 webRTCAdaptor.play("stream1");
}
  5. Call stop() to Stop Playing

You can stop playing at any time by calling the stop function of WebRTCAdaptor

webRTCAdaptor.stop()

Take a look at JavaScript Error Callbacks to handle callbacks gracefully.

Sample

Please take a look at the WebRTCAppEE/player.html file to see how the JavaScript SDK can be used for playing a stream.

WebRTC WebSocket Messaging Details

This documentation is for developers who need to implement signalling between Ant Media Server and clients for publishing and playing streams. Let’s go through it step by step.

Publishing WebRTC Stream

  1. The client connects to Ant Media Server through WebSocket. The URL of the WebSocket interface is something like
ws://SERVER_NAME:5080/WebRTCAppEE/websocket
  2. The client sends the publish JSON command to the server with the stream id parameter. (Remove the token parameter if token control is not enabled)
{
    command : "publish",
    streamId : "stream1",
    token : "tokenId"
}
  3. If the server accepts the stream, it replies with the start command
{
    command : "start",
    streamId : "stream1"
}
  4. The client initializes its peer connection, creates an offer SDP, and sends the SDP configuration to the server with the takeConfiguration command
{
   command : "takeConfiguration",
   streamId : "stream1",
   type : "offer",
   sdp : "${SDP_PARAMETER}"
}
  5. The server creates an answer SDP and sends the SDP configuration to the client with the takeConfiguration command
{
   command : "takeConfiguration",
   streamId : "stream1",
   type : "answer",
   sdp : "${SDP_PARAMETER}"
}
  6. The client and server gather ICE candidates several times and send them to each other with the takeCandidate command
{
    command : "takeCandidate",
    streamId : "stream1",
    label : "${CANDIDATE.SDP_MLINE_INDEX}",
    id : "${CANDIDATE.SDP_MID}",
    candidate : "${CANDIDATE.CANDIDATE}"
}
  7. The client sends the stop JSON command to stop publishing
{
    command : "stop",
    streamId: "stream1"
}
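The publish-side signalling above can be sketched as a handful of message builders around a plain WebSocket. This is only an illustrative sketch of the message shapes described in the steps: the field names (command, streamId, token, type, sdp, label, id, candidate) come from the protocol, while the function names are made up for this example, and the actual WebSocket and RTCPeerConnection wiring is left out.

```javascript
// Build the JSON messages for the publish signalling flow.
// Each function returns the serialized message to pass to ws.send(...).
function publishCommand(streamId, token) {
  const msg = { command: "publish", streamId: streamId };
  if (token) msg.token = token; // omit when token control is disabled
  return JSON.stringify(msg);
}

function takeConfiguration(streamId, type, sdp) {
  // type is "offer" on the client side of the publish flow
  return JSON.stringify({ command: "takeConfiguration", streamId, type, sdp });
}

function takeCandidate(streamId, candidate) {
  // candidate is an RTCIceCandidate-like object from onicecandidate
  return JSON.stringify({
    command: "takeCandidate",
    streamId,
    label: candidate.sdpMLineIndex,
    id: candidate.sdpMid,
    candidate: candidate.candidate,
  });
}

function stopCommand(streamId) {
  return JSON.stringify({ command: "stop", streamId });
}
```

In a real client, each builder result would be sent over the WebSocket opened to ws://SERVER_NAME:5080/WebRTCAppEE/websocket, e.g. `ws.send(publishCommand("stream1"))` after the socket opens.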

Playing WebRTC Stream

  1. The client connects to Ant Media Server through WebSocket.
ws://SERVER_NAME:5080/WebRTCAppEE/websocket
  2. The client sends the play JSON command to the server with the stream id parameter. (Remove the token parameter if token control is not enabled)
{
    command : "play",
    streamId : "stream1",
    token : "tokenId"
}
  3. If the server accepts the stream, it replies with an offer via the takeConfiguration command
{
   command : "takeConfiguration",
   streamId : "stream1",
   type : "offer",
   sdp : "${SDP_PARAMETER}"
}
  4. The client creates an answer SDP and sends the SDP configuration to the server with the takeConfiguration command
{
   command : "takeConfiguration",
   streamId : "stream1",
   type : "answer",
   sdp : "${SDP_PARAMETER}"
}
  5. The client and server gather ICE candidates several times and send them to each other with the takeCandidate command
{
    command : "takeCandidate",
    streamId : "stream1",
    label : "${CANDIDATE.SDP_MLINE_INDEX}",
    id : "${CANDIDATE.SDP_MID}",
    candidate : "${CANDIDATE.CANDIDATE}"
}
  6. The client sends the stop JSON command to stop playing
{
    command : "stop",
    streamId : "stream1"
}
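On the play side, the client mostly reacts to server messages. A minimal dispatcher for the flow above can map each incoming message to a local action; in a real client the actions would call RTCPeerConnection methods (setRemoteDescription, createAnswer, addIceCandidate). The action names returned here are illustrative, not part of the protocol.

```javascript
// Map an incoming signalling message (raw JSON string) to an action name.
function dispatchPlayMessage(raw) {
  const msg = JSON.parse(raw);
  switch (msg.command) {
    case "takeConfiguration":
      // the server sends the offer; the client must reply with an answer
      return msg.type === "offer" ? "setRemoteOfferAndAnswer" : "setRemoteAnswer";
    case "takeCandidate":
      // feed the candidate to RTCPeerConnection.addIceCandidate
      return "addIceCandidate";
    case "error":
      return "handleError:" + msg.definition;
    default:
      return "ignore";
  }
}
```

A client would call this from the WebSocket's onmessage handler, e.g. `ws.onmessage = (e) => perform(dispatchPlayMessage(e.data));` where `perform` is the app's own action runner.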

Peer to Peer WebRTC Stream

  1. Peers connect to Ant Media Server through WebSocket.
ws://SERVER_NAME:5080/WebRTCAppEE/websocket
  2. The client sends the join JSON command to the server with the stream id parameter.
{
    command : "join",
    streamId : "stream1"
}

If there is only one peer in stream1, the server waits for the other peer to join the room.

  3. When the second peer joins the stream, the server sends the start JSON command to the first peer
{
    command : "start",
    streamId : "stream1"
}
  4. The first peer creates an offer SDP and sends it to the server with the takeConfiguration command
{
   command : "takeConfiguration",
   streamId : "stream1",
   type : "offer",
   sdp : "${SDP_PARAMETER}"
}

The server relays the offer SDP to the second peer.

  5. The second peer creates an answer SDP and sends it to the server with the takeConfiguration command
{
   command : "takeConfiguration",
   streamId : "stream1",
   type : "answer",
   sdp : "${SDP_PARAMETER}"
}

The server relays the answer SDP to the first peer.

  6. Each peer gathers ICE candidates several times and sends them to the other peer through the server with the takeCandidate command
{
    command : "takeCandidate",
    streamId : "stream1",
    label : "${CANDIDATE.SDP_MLINE_INDEX}",
    id : "${CANDIDATE.SDP_MID}",
    candidate : "${CANDIDATE.CANDIDATE}"
}
  7. Clients send the leave JSON command to leave the room
{
    command : "leave",
    streamId: "stream1"
}
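In the peer-to-peer flow above, the roles are decided by the first message a peer receives: the peer that receives start becomes the offerer, while the peer that instead receives an offer via takeConfiguration becomes the answerer. The helper below captures just that role decision as a sketch; the role names are made up for this example.

```javascript
// Decide this peer's role from the first signalling message received
// after joining. raw is the JSON string from the WebSocket.
function peerRole(raw) {
  const msg = JSON.parse(raw);
  if (msg.command === "start") {
    return "offerer"; // this peer must create and send the offer SDP
  }
  if (msg.command === "takeConfiguration" && msg.type === "offer") {
    return "answerer"; // this peer must answer the relayed offer
  }
  return "waiting"; // e.g. still alone in the room
}
```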

Conference WebRTC Stream

  1. Peers connect to Ant Media Server through WebSocket.
ws://SERVER_NAME:5080/WebRTCAppEE/websocket
  2. The client sends the joinRoom JSON command to the server with the room name parameter.
{
    command : "joinRoom",
    room : "room1"
}
  3. The server notifies the client with the available streams in the room
{
    command : "notification",
    definition : "joinedTheRoom",
    streamId : "unique_stream_id_returned_by_the_server",
    streams : [
        "stream1_in_the_room",
        "stream2_in_the_room",
        ...
    ]
}

The streamId returned by the server is the stream id the client uses to publish a stream to the room. streams is the JSON array of streams which the client can play via WebRTC. The client can play each stream with the play method above. This streams array can be empty if there is no stream in the room.

  4. When a new participant joins the room, the server sends the message below to each peer in the room.
{
    command : "notification",
    definition : "streamJoined",
    streamId : "new_stream_id_joined_the_room"
}

The client can play the newly joined stream with its streamId using the play method above.

  5. When someone leaves the room, the server sends the message below to each peer in the room.
{
    command : "notification",
    definition : "streamLeaved",
    streamId: "stream_id_leaved_the_room"
}

Client can update/remove the related video views from UI.

  6. Any user can leave the room by sending the message below
{
    command : "leaveFromRoom",
    room: "roomName"
}
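The three notification messages above (joinedTheRoom, streamJoined, streamLeaved) are all a conference client needs to keep its list of playable streams up to date. A minimal, purely illustrative reducer for that list might look like this; the function name is an assumption, only the message fields come from the protocol.

```javascript
// Update the list of playable stream ids from one notification message.
// streams is the current list; raw is the JSON string from the WebSocket.
// Returns the new list (the input list is not mutated).
function updateRoomStreams(streams, raw) {
  const msg = JSON.parse(raw);
  if (msg.command !== "notification") return streams;
  switch (msg.definition) {
    case "joinedTheRoom":
      // seed the list with whatever is already in the room (may be empty)
      return msg.streams ? msg.streams.slice() : [];
    case "streamJoined":
      return streams.concat(msg.streamId);
    case "streamLeaved":
      return streams.filter((id) => id !== msg.streamId);
    default:
      return streams;
  }
}
```

Each id in the resulting list can then be passed to the play method above, and removed ids tell the UI which video views to drop.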

WebSocket Error Callbacks

  • noStreamNameSpecified: It is sent when the stream id is not specified in the message.
{
    command : "error",
    definition : "noStreamNameSpecified"
}
  • not_allowed_unregistered_streams: This is sent back to the user if the publisher wants to send a stream with an unregistered id and the server is configured not to allow such streams
{
    command : "error",
    definition : "not_allowed_unregistered_streams"
}
  • no_room_specified: This is sent back to the user when no room is specified when joining the video conference.
{
    command : "error",
    definition : "no_room_specified"
}
  • unauthorized_access: This is sent back to the user when the token is not validated
{
    command : "error",
    definition : "unauthorized_access"
}
  • no_encoder_settings: This is sent back to the user when there are no encoder settings available for publishing the stream.
{
    command : "error",
    definition : "no_encoder_settings"
}
  • no_peer_associated_before: This is a peer to peer connection error definition. It is sent back to the user when there is no peer associated with the stream.
{
    command : "error",
    definition : "no_peer_associated_before"
}
  • notSetLocalDescription: It is sent when the local description is not set successfully
{
    command : "error",
    definition : "notSetLocalDescription"
}

WebRTC JS Error Callback Messages

This documentation lists the error callbacks and their descriptions for WebRTC operations.

JavaScript Error Callbacks

  • ``WebSocketNotSupported``: WebSocket connection is not supported for environment or connection is not in the correct state.
  • ``AbortError``: Although the user and operating system both granted access to the hardware device, and no hardware issues occurred that would cause a NotReadableError, some problem occurred which prevented the device from being used.
  • ``NotAllowedError``: The user has specified that the current browsing instance is not permitted access to the device, or the user has denied access for the current session, or the user has denied all access to user media devices globally.
  • ``NotFoundError``: No media tracks of the type specified were found that satisfy the given constraints.
  • ``OverconstrainedError``: The specified constraints resulted in no candidate devices which met the criteria requested. The error is an object of type OverconstrainedError and has a constraint property whose string value is the name of a constraint which was impossible to meet, and a message property containing a human-readable string explaining the problem.
  • ``SecurityError``: User media support is disabled on the Document on which getUserMedia() was called. The mechanism by which user media support is enabled and disabled is left up to the individual user agent.
  • ``AudioAlreadyActive``: If audio is already active, callbackError is called with “AudioAlreadyActive”.
  • ``Camera or Mic is being used by some other process that does not let read the devices``: It is sent when the media devices are being used by another application.
  • ``VideoAlreadyActive``: If video is already active, callbackError is called with “VideoAlreadyActive”.
  • ``NotSupportedError``: It is sent when SSL is required.
  • ``noStreamNameSpecified``: It is sent when the stream id is not specified in the message.
  • ``not_allowed_unregistered_streams``: This is sent back to the user if the publisher wants to send a stream with an unregistered id and the server is configured not to allow such streams.
  • ``no_room_specified``: This is sent back to the user when no room is specified when joining the video conference.
  • ``unauthorized_access``: This is sent back to the user when the token is not validated.
  • ``no_encoder_settings``: This is sent back to the user when there are no encoder settings available in publishing the stream.
  • ``no_peer_associated_before``: This is a peer to peer connection error definition. It is sent back to the user when there is no peer associated with the stream.
  • ``notSetLocalDescription``: It is sent when local description is not set successfully.
  • ``screen_share_permission_denied``: It is sent when the user does not allow screen sharing.
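One practical way to use the list above is to translate the machine-readable error definitions into user-friendly messages inside callbackError. The sketch below shows that pattern; the wording of the messages and the function name are illustrative, only the error definitions themselves come from the list above.

```javascript
// Map error definitions delivered to callbackError to display text.
// Unlisted errors fall back to showing the raw definition.
const ERROR_MESSAGES = {
  NotFoundError: "No camera or microphone matching the constraints was found.",
  NotAllowedError: "Access to the camera/microphone was denied.",
  OverconstrainedError: "No device satisfies the requested constraints.",
  WebSocketNotSupported: "WebSocket is not supported in this environment.",
  screen_share_permission_denied: "Screen sharing was not allowed.",
};

function describeError(error) {
  return ERROR_MESSAGES[error] || "Unexpected error: " + error;
}
```

In the WebRTCAdaptor configuration, `callbackError : function(error) { alert(describeError(error)); }` would then replace the bare `alert(error)` used in the earlier examples.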

WebRTC Signalling Server

Ant Media Server can also be used as a signalling server for peer to peer connections. WebSocket is used for the connection between peers and Ant Media Server.

In order to use Ant Media Server as a signalling server, you just need to use an app that provides this feature. If you do not know how to do that, drop an email to contact at antmedia dot io

JavaScript SDK

Of course, there is a JavaScript SDK to make using the signalling server straightforward. There is a sample peer.html file in the sample app; you can try it to understand how to use the JavaScript SDK.

How to use JavaScript SDK

The JavaScript SDK is easy to use: just create a WebRTCAdaptor object and call the join(roomName) function. Let’s go through it step by step.

Load the scripts below in the head element of the HTML file.
<head>
...
<script src="https://webrtc.github.io/adapter/adapter-latest.js"></script>
<script src="js/webrtc_adaptor.js" ></script>
...
</head>
Create video elements somewhere in the body tag
<video id="localVideo" autoplay muted width="480"></video>
<video id="remoteVideo" autoplay controls width="480"></video>

The first video tag is for the local video and the second one is for the remote video.

Initialize the WebRTCAdaptor object in a script tag
<script>
    var pc_config = null;

      var sdpConstraints =
      {
          OfferToReceiveAudio : true,
          OfferToReceiveVideo : true
      };
      var mediaConstraints = {
              video: true,
              audio: true
            };

      var webRTCAdaptor = new WebRTCAdaptor({
          websocket_url:"ws://" + location.hostname + ":8081/WebRTCApp4",  // url of the WebSocket Signalling Server
          mediaConstraints: mediaConstraints,
          peerconnection_config: pc_config,
          sdp_constraints: sdpConstraints,
          localVideoId: "localVideo",   // id of the local video tag
          remoteVideoId: "remoteVideo",  // id of the remote video tag

          callback: function(info) {     // *success callback function*

                    if (info == "initialized")
                    {
                        // it is called with this parameter when it connects to
                        // signalling server and everything is ok
                        console.log("initialized");
                    }
                    else if (info == "joined")
                    {
                       // it is called with this parameter when it joins a room
                       console.log("joined");
                    }
                    else if (info == "leaved")
                    {
                        // it is called with this parameter when it leaves from room
                        console.log("leaved");
                    }
                  },
          callbackError: function(error) {
                    // error callback function it is called when an error occurs
                    //some of the possible errors, NotFoundError, SecurityError,PermissionDeniedError
                    console.log("error callback: " + error);
                    alert(error);
          }
      });
</script>
Call the join function

In order to create a connection between peers, each peer should join the same room by calling the join(roomName) function of the WebRTCAdaptor. When there are two peers in the same room, signalling starts automatically and a peer to peer connection is established.

For instance, you can call the join function in the success callback function when the info parameter has the value “initialized”

if (info == "initialized")
{
 // it is called with this parameter when it connects to
 // signalling server and everything is ok
 console.log("initialized");
 webRTCAdaptor.join("room1");
}

According to the code above, when the peer.html file is opened by two peers, they will join “room1” and a peer to peer connection will be established.

Functions

As shown above, the main object is WebRTCAdaptor, so let’s look at its functions

  • join(roomName) :

    Lets the peer join the room specified in the parameter. If the operation is successful, the callback function is called with the info parameter having the value “joined”. When there are two people in the same room, signalling starts automatically and a peer to peer connection is established

  • leave():

    Lets the peer leave the room it joined previously. If the operation is successful, the callback function is called with the info parameter having the value “leaved”

  • turnOnLocalCamera():

    Turns on the local camera and adds the video stream to the peer connection

  • turnOffLocalCamera():

    Turns off the local camera and removes the video stream from the peer connection

  • unmuteLocalMic():

    Unmutes the local microphone and adds the audio stream to the peer connection

  • muteLocalMic():

    Mutes the local microphone and removes the audio stream from the peer connection
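The camera and microphone functions above come in on/off pairs, so a UI typically wraps them in toggles that remember the current state. The sketch below is an illustrative wrapper around an already-initialized adaptor instance; the wrapper and its method names are assumptions for this example, only the four WebRTCAdaptor function names come from the list above.

```javascript
// Wrap a WebRTCAdaptor-like object with stateful camera/mic toggles.
// adaptor must provide turnOnLocalCamera, turnOffLocalCamera,
// muteLocalMic and unmuteLocalMic. Both devices start enabled, matching
// the default mediaConstraints { video: true, audio: true }.
function makeMediaToggles(adaptor) {
  const state = { cameraOn: true, micOn: true };
  return {
    toggleCamera() {
      state.cameraOn ? adaptor.turnOffLocalCamera() : adaptor.turnOnLocalCamera();
      state.cameraOn = !state.cameraOn;
      return state.cameraOn; // new camera state
    },
    toggleMic() {
      state.micOn ? adaptor.muteLocalMic() : adaptor.unmuteLocalMic();
      state.micOn = !state.micOn;
      return state.micOn; // new mic state
    },
  };
}
```

A button's click handler can then simply call `toggles.toggleCamera()` and use the returned boolean to update its label.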

Sample

Please take a look at the WebRTCApp4/peer.html file to see how the JavaScript SDK can be used.

WebRTC iOS SDK

This library simplifies the connection between browsers and iOS devices (iPhone and iPad for now) by using Ant Media Server as a signaling server.

This documentation aims to introduce how to use the AntMedia WebRTC iOS SDK in your application. This SDK supports three modes: peer to peer, play, and publish.

Features

  • Fully written in Swift 4.
  • Peer Connection: two nodes connect to each other, 1-1 connection
  • Publish: one node publishes, 1-N connection
  • Play: other nodes play the broadcast from the publisher, 1-N connection
  • Allows enabling/disabling the camera and microphone for the local stream
  • Simple and concise codebase. You need just a little initialization and delegation code.
  • Clear error messages

Requirements

In order to use this SDK, you need iOS 10+, a WebRTC build, the AntMedia WebRTC iOS SDK, the Starscream library for signaling, and Ant Media Server Enterprise Edition. Please contact us at contact@antmedia.io. We can provide the WebRTC iOS SDK and Enterprise Edition for trial or personal use.

Installation

  • We already built a version of WebRTC for the iOS platform. Here is the link to download it: https://antmedia.io/. After downloading, in Xcode, select your application as the target and add WebRTC.framework as an embedded binary.
  • The next step is signaling: this SDK uses Starscream to manage WebSocket connections. If you use CocoaPods or Carthage, please follow the steps at GitHub - daltoniam/Starscream
  • Now add the AntMedia WebRTC iOS SDK to your project as an embedded binary.

Eventually, you should have two embedded binaries plus Starscream via your dependency management tool. Now you are ready to use the SDK in your application.

Usage

First, in your ViewController, import the AntMediaSDK and WebRTC libraries.

import AntMediaSDK
import WebRTC

Then we are ready to use AntMediaClient with just one line:

let client = AntMediaClient.init()

Before starting the connection, we just need to set a few options: the local and remote video views as RTCEAGLVideoView (provided by WebRTC) and an AntMediaClientDelegate to handle the notifications delivered by the SDK. Let’s define these in the ViewController:

class YourViewController: UIViewController {
    @IBOutlet weak var localVideoView: RTCEAGLVideoView!
    @IBOutlet weak var remoteVideoView: RTCEAGLVideoView!
}

And here is AntMediaClientDelegate:

extension YourViewController: AntMediaClientDelegate {
    // Add all protocol methods.
}

Now we are ready to initialize and start AntMediaClient using the server url, stream id, and mode:

client.delegate = self
client.setOptions(url: server, streamId: room, mode: self.getMode())
client.setVideoViews(local: localVideoView, remote: remoteVideoView)

If you run Ant Media Server (Enterprise Edition) on your computer, the server url should be localhost; otherwise, the IP address should be used. Please do not forget the protocol (ws or wss).

The stream id is a kind of room name that makes it easier for the nodes to find each other.

There are three modes: P2P, Play and Publish. You can use AntMediaClientMode to select each one.

If you complete these steps, the AntMediaClient instance is ready for connection:

client.connect()

After you call the connect method, the clientDidConnect or clientDidDisconnect method will be called. If the connection is okay, the client is ready to start streaming. You just need to call the start method:

client.start()

Before you go on, please do not forget to add the Camera and Microphone usage descriptions in Info.plist. Otherwise the app will crash.

Delegation

There are 6 methods in AntMediaClientDelegate:

  • clientDidConnect: This method will be called if connection is okay with Ant Media Server.
  • clientDidDisconnect: This method will be called if connection fails. Message is available as an argument to handle what’s wrong: Network issues or server issues etc.
  • clientHasError: This method will be called if something goes wrong: socket issues, the remote side closing the connection, etc.
  • remoteStreamStarted: This method will be called after the remote source is available. This method is generally used for setting ratio and content mode.
  • remoteStreamRemoved: This method will be called if the remote source closes connection or leaves from room. This method is generally used for setting ratio and content mode.
  • localStreamStarted: This method will be called when camera capture and the microphone are ready to use.

Support

This SDK is still in beta. We would appreciate it if you shared your experience with us. Please do not hesitate to open an issue.

WebRTC Android SDK

The WebRTC Native Android SDK lets developers build their own native WebRTC applications with just a few clicks. The WebRTC Android SDK has built-in functions such as publishing to Ant Media Server for one-to-many streaming, playing a stream on Ant Media Server and, lastly, P2P communication by using Ant Media Server as a signalling server. Without further ado, let’s get our hands dirty.

Download the WebRTC Native Android SDK

We provide the WebRTC Native Android SDK to Ant Media Server Enterprise users for free. Please get in touch to obtain the WebRTC Native Android SDK. Then download the SDK and extract it to a directory that we will import into the Android app project.

Creating Android Project

Open Android Studio and Create a New Android Project

Just click ``File > New > New Project``. A window should open as shown below for the project details. Fill in the form according to your organization and project name.

Create Android Studio Project For WebRTC Native Android SDK

Click the Next button and choose ``Phone and Tablet`` as below.

WebRTC Native Android App

Lastly, choose ``Empty Activity`` in the next window

Choose Empty Activity

Leave the default settings for the activity name and layout. Click Next and finish creating the project.

Import WebRTC SDK to Android Project

After creating the project, let’s import the WebRTC Android SDK into it. To do that, click ``File > New > Import Module``. Choose the directory of the WebRTC Android SDK and click the Finish button.

Import Native WebRTC Android SDK

If the module is not included in the project, add the module name to the settings.gradle file as shown in the image below.

Import module in settings.gradle

Add dependency to Android Project App Module

Right click app, choose Open Module Settings and click the Dependencies tab. A window should appear as below. Click the + button at the bottom and choose ``Module Dependency``

Add Module Dependency WebRTC Android SDK

Choose the WebRTC Native Android SDK and click the OK button

Native WebRTC Android SDK

The CRITICAL thing here is that you need to import the module as an API, as shown in the image below. You can change it from Implementation to API in the drop-down list.

Choose API in drop down list

Now let’s do the other simple stuff.

Prepare the App for Stream Publishing

Set permissions for the App

Open AndroidManifest.xml and add the permissions below between the application and manifest tags

<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-feature
    android:glEsVersion="0x00020000"
    android:required="true" />

<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />

<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

Implement MainActivity onCreate function

Open the MainActivity and implement it as below. You should change SERVER_URL according to your Ant Media Server address. Secondly, the third parameter in the last line of the code below is IWebRTCClient.MODE_PUBLISH, which publishes the stream to the server. You can use IWebRTCClient.MODE_PLAY for playing a stream and IWebRTCClient.MODE_JOIN for P2P communication. If token control is enabled, you should define the tokenId parameter.

public class MainActivity extends AbstractWebRTCActivity {

    public static final String SERVER_URL = "ws://192.168.1.21:5080/WebRTCAppEE/websocket";
    private CallFragment callFragment;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        //below exception handler show the exception in a popup window
        //it is better to use in development, do not use in production
        Thread.setDefaultUncaughtExceptionHandler(new UnhandledExceptionHandler(this));

        // Set window styles for fullscreen-window size. Needs to be done before
        // adding content.
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN | WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON
                | WindowManager.LayoutParams.FLAG_DISMISS_KEYGUARD | WindowManager.LayoutParams.FLAG_SHOW_WHEN_LOCKED
                | WindowManager.LayoutParams.FLAG_TURN_SCREEN_ON);
        getWindow().getDecorView().setSystemUiVisibility(getSystemUiVisibility());

        setContentView(R.layout.activity_main);

        webRTCClient = new WebRTCClient(this, this);

        String streamId = "stream" + (int)(Math.random() * 999);
        String tokenId = "tokenID";
        callFragment = new CallFragment();
        callFragment.setCallEvents(webRTCClient);
        callFragment.setStreamId(streamId);
        FragmentTransaction ft = getFragmentManager().beginTransaction();
        ft.add(R.id.call_fragment_container, callFragment);
        ft.commit();

        SurfaceViewRenderer cameraViewRenderer = findViewById(R.id.camera_view_renderer);

        webRTCClient.setFullScreenRenderer(cameraViewRenderer);

        checkPermissions();

        //streamId is randomly assigned and it will be shown to the screen to watch it on Ant Media Server
        webRTCClient.startStream(SERVER_URL, streamId, IWebRTCClient.MODE_PUBLISH, tokenId);

    }
}

The WebRTCClient methods are described below

void WebRTCClient.init(String url, String streamId, String mode, String token)

        @param url is websocket url to connect
        @param streamId is the stream id in the server to process
        @param mode one of the MODE_PUBLISH, MODE_PLAY, MODE_JOIN
        @param token is one time token string

        If mode is MODE_PUBLISH, stream with streamId field will be published to the Server
        if mode is MODE_PLAY, stream with streamId field will be played from the Server
void WebRTCClient.setOpenFrontCamera(boolean openFrontCamera)

        Camera open order.
        By default the front camera is attempted first;
        if openFrontCamera is set to false, a camera other than the front one is tried.
        @param openFrontCamera if true, the front camera will be tried first;
                               if false, a camera other than the front one will be tried
void WebRTCClient.startStream()

        Starts the streaming according to mode
void WebRTCClient.stopStream()

        Stops the streaming
void WebRTCClient.switchCamera()

        Switches the cameras
void WebRTCClient.switchVideoScaling(RendererCommon.ScalingType scalingType)

        Switches the video according to type and its aspect ratio
        @param scalingType
boolean WebRTCClient.toggleMic()

        toggle microphone
        @return Microphone Current Status (boolean)
void WebRTCClient.stopVideoSource()

        Stops the video source
void WebRTCClient.startVideoSource()

        Starts or restarts the video source
void WebRTCClient.setSwappedFeeds(boolean b)

        Swaps the fullscreen renderer and the pip renderer
        @param b
void WebRTCClient.setVideoRenderers(SurfaceViewRenderer pipRenderer, SurfaceViewRenderer fullscreenRenderer)

        Sets the video renderers
        @param pipRenderer can be null
        @param fullscreenRenderer cannot be null
String WebRTCClient.getError()

        Gets the error
        @return the error, or null if there is none

Edit the activity_main.xml as below

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <org.webrtc.SurfaceViewRenderer
        android:id="@+id/camera_view_renderer"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="center" />

    <FrameLayout
        android:id="@+id/call_fragment_container"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</FrameLayout>

Build and Start the App

The app directly publishes the stream to the server, so before that we need to grant the app the required permissions. Make sure the app has the permissions shown below.

Permissions

Then restart the app; it should open the camera and start streaming. You should see the stream id on the screen as below. You can go to http://SERVER_URL:5080/WebRTCAppEE/player.html, write the stream id in the text box and click the Play button.

Publish with WebRTC