
How I wrote a PeerJS (WebRTC) client for Android

Recently, I had to write an Android client application for a server that sets up video calls between users with the PeerJS library. This library is, roughly speaking, a wrapper over WebRTC.

I approached the task with enthusiasm, since I had never done anything this complicated before.
Naturally, the first step was searching for libraries and projects that already implement such functionality.
I found the WebRTC sample, and then a project that implemented all of this more simply.

I started adapting it, because the broker I use is peerjs.
I changed the JavaScript connection code and started getting error messages. It took a while, but I dare say the problem is that the standard WebView (through which the JavaScript code is executed) does not support WebRTC.

"How do I get around this problem?" I asked myself. I googled for a long time and found nothing sensible.
So I decided it would be useful to dig into the PeerJS API and see how they implement everything.
I found several requests that the library sends, which are easy to reproduce, and also figured out how peerjs connects clients: via WebSocket!
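
Concretely, two requests turned out to be enough to reproduce; both appear verbatim in the code later in this article:

  GET http://0.peerjs.com:9000/<key>/id?ts=<timestamp>.<random>
      -- returns the id the broker assigns to us
  ws://0.peerjs.com:9000/peerjs?key=<key>&id=<id>&token=<token>
      -- the signaling WebSocket itself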

After that, I came across a project, and it settled everything!
Below is the code, which fits into a single Activity.

Well, first, you need to take the libs folder from the last project mentioned and add it to your own project.
Next, create an Activity and add the fields we will use later:
  private static final String TAG = "RTCActivity"; // log tag; used by logAndToast() and preferISAC() below
  private static boolean factoryStaticInitialized;
  private GLSurfaceView surfaceView;
  private VideoRenderer.Callbacks localRender;
  private VideoRenderer.Callbacks remoteRender;
  private VideoRenderer localRenderer;
  private VideoSource videoSource;
  private VideoTrack videoTrack;
  private AudioTrack audioTrack;
  private MediaStream localMediaStream;
  private boolean videoSourceStopped;
  private boolean initiator = false;
  private boolean video = true;
  private boolean audio = true;
  private WebSocketClient client;
  private PeerConnectionFactory factory;
  private PeerConnection peerConnection;
  private final PCObserver pcObserver = new PCObserver();
  private final SDPObserver sdpObserver = new SDPObserver();
  private MediaConstraints sdpMediaConstraints;
  private AudioManager audioManager; // used in onCreate() and disconnectAndExit() below
  private AudioManager.OnAudioFocusChangeListener audioFocusListener; // referenced in disconnectAndExit()
  private LinkedList<PeerConnection.IceServer> iceServers = new LinkedList<PeerConnection.IceServer>();
  private LinkedList<IceCandidate> queuedRemoteCandidates = new LinkedList<IceCandidate>();
  private Toast logToast;
  private final Boolean[] quit = new Boolean[] { false };
  private String roomKey; // the peerjs API key (see GetID below)
  private String id; // our id, issued by the broker
  private String friendId; // the id of the user on the other end
  private String token = ""; // a random string, generated for each connection
  private String connectionId = "mc_"; // a random string that must start with "mc_"


Create a GLSurfaceView in onCreate, in which we will display the video, and add it to some container:
  surfaceView = new GLSurfaceView(this);
  surfaceView.setLayoutParams(new LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT));
  LinearLayout content = (LinearLayout) findViewById(R.id.activity_webrtc_content);
  content.addView(surfaceView);


Next, configure the video display locations:
  VideoRendererGui.setView(surfaceView);
  remoteRender = VideoRendererGui.create(0, 0, 100, 100);
  localRender = VideoRendererGui.create(1, 74, 25, 25);

Here we specify that the incoming video will occupy 100% of the width and height of the surfaceView, while the video from our own camera will sit in the lower-left corner: offset 1% of the width from the left edge and 74% of the height from the top, with a size of 25% by 25%.
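
In other words, VideoRendererGui.create(x, y, width, height) takes percentages of the surfaceView measured from its top-left corner, so moving the preview is just a matter of changing the numbers. A small sketch using the same API, only with different offsets:

  // full-screen remote video: top-left corner, 100% x 100%
  remoteRender = VideoRendererGui.create(0, 0, 100, 100);
  // local preview in the lower-right corner instead of the lower-left
  localRender = VideoRendererGui.create(74, 74, 25, 25);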

  if (!factoryStaticInitialized) {
      PeerConnectionFactory.initializeAndroidGlobals(this, true, true);
      factoryStaticInitialized = true;
  }

  audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
  @SuppressWarnings("deprecation")
  boolean isWiredHeadsetOn = audioManager.isWiredHeadsetOn();
  audioManager.setMode(isWiredHeadsetOn ? AudioManager.MODE_IN_CALL : AudioManager.MODE_IN_COMMUNICATION);
  audioManager.setSpeakerphoneOn(!isWiredHeadsetOn);

  sdpMediaConstraints = new MediaConstraints();
  sdpMediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
  sdpMediaConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));

  iceServers.add(new PeerConnection.IceServer("stun:stun.l.google.com:19302"));

  createPC();

initializeAndroidGlobals — don't ask me why this method exists. I only know that without it the connection is not created.
iceServers — read up on these elsewhere. Sometimes video is not transmitted simply because you need to add more servers in the same way. And on the server side, if you plan to call from the website too, you need to specify the iceServers as well.
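
For example, a TURN relay is added the same way as the STUN server above; the PeerConnection.IceServer constructor of that era also accepts credentials. The host and credentials below are placeholders, not a real server:

  // STUN, as in the code above
  iceServers.add(new PeerConnection.IceServer("stun:stun.l.google.com:19302"));
  // hypothetical TURN relay with username/password
  iceServers.add(new PeerConnection.IceServer(
          "turn:turn.example.com:3478", "username", "password"));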

Next, we implement the createPC() method:

  void createPC() {
      factory = new PeerConnectionFactory();

      MediaConstraints pcConstraints = new MediaConstraints();
      pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
      pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
      pcConstraints.optional.add(new MediaConstraints.KeyValuePair("RtpDataChannels", "true"));
      pcConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));

      peerConnection = factory.createPeerConnection(iceServers, pcConstraints, pcObserver);

      // workaround for a bug in the library
      createDataChannelToRegressionTestBug2302(peerConnection);

      logAndToast("Creating local video source...");

      // limit the resolution of the outgoing video
      MediaConstraints videoConstraints = new MediaConstraints();
      videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxHeight", "240"));
      videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxWidth", "320"));

      localMediaStream = factory.createLocalMediaStream("ARDAMS");
      VideoCapturer capturer = getVideoCapturer();
      videoSource = factory.createVideoSource(capturer, videoConstraints);
      videoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);

      // render our own video so we can see ourselves
      localRenderer = new VideoRenderer(localRender);
      videoTrack.addRenderer(localRenderer);
      localMediaStream.addTrack(videoTrack);

      // add the audio track
      audioTrack = factory.createAudioTrack("ARDAMSa0", factory.createAudioSource(new MediaConstraints()));
      localMediaStream.addTrack(audioTrack);

      peerConnection.addStream(localMediaStream, new MediaConstraints());

      GetID getId = new GetID();
      try {
          // async task that requests an id from the broker over http
          getId.execute();
      } catch (Exception e) {
          logAndToast("No Internet connection");
          disconnectAndExit();
      }
  }


  private class GetID extends AsyncTask<Void, Void, String> {
      @Override
      protected String doInBackground(Void... params) {
          // NetHelper is an http helper class; we request an id from the broker
          // roomKey is the API key; the PeerJS demo key is "lwjd5qra8257b9"
          // 7330598266421392 is just a random number appended to the id request
          NetHelper a = NetHelper.getInstance(RTCActivity.this);
          String result = a.executeHttpGet("http://0.peerjs.com:9000/" + roomKey + "/id?ts="
                  + Calendar.getInstance().getTimeInMillis() + ".7330598266421392");
          if (result == null) return null;
          // the id arrives with a trailing line break, so we strip it
          result = result.replace("\n", "");
          return result;
      }

      @Override
      protected void onPostExecute(String result) {
          super.onPostExecute(result);
          if (result == null) return;
          id = result; // our id, under which the broker now knows us

          WebSocketClient.Listener listener = new WebSocketClient.Listener() {
              @Override
              public void onMessage(byte[] arg0) { }

              @Override
              public void onMessage(final String data) {
                  runOnUiThread(new Runnable() {
                      public void run() {
                          try {
                              JSONObject json = new JSONObject(data);
                              String type = (String) json.get("type");
                              if (type.equalsIgnoreCase("candidate")) {
                                  JSONObject jsonCandidate = json.getJSONObject("payload").getJSONObject("candidate");
                                  IceCandidate candidate = new IceCandidate(
                                          (String) jsonCandidate.get("sdpMid"),
                                          jsonCandidate.getInt("sdpMLineIndex"),
                                          (String) jsonCandidate.get("candidate"));
                                  if (queuedRemoteCandidates != null) {
                                      queuedRemoteCandidates.add(candidate);
                                  } else {
                                      peerConnection.addIceCandidate(candidate);
                                  }
                              } else if (type.equalsIgnoreCase("answer") || type.equalsIgnoreCase("offer")) {
                                  connectionId = json.getJSONObject("payload").getString("connectionId");
                                  friendId = json.getString("src");
                                  JSONObject jsonSdp = json.getJSONObject("payload").getJSONObject("sdp");
                                  SessionDescription sdp = new SessionDescription(
                                          SessionDescription.Type.fromCanonicalForm(type),
                                          preferISAC((String) jsonSdp.get("sdp")));
                                  peerConnection.setRemoteDescription(sdpObserver, sdp);
                              } else if (type.equalsIgnoreCase("bye")) {
                                  logAndToast("Remote end hung up; dropping PeerConnection");
                                  disconnectAndExit();
                              } else {
                                  //throw new RuntimeException("Unexpected message: " + data);
                              }
                          } catch (JSONException e) {
                              //throw new RuntimeException(e);
                          }
                      }
                  });
              }

              @Override
              public void onError(Exception arg0) {
                  runOnUiThread(new Runnable() {
                      public void run() { disconnectAndExit(); }
                  });
              }

              @Override
              public void onDisconnect(int arg0, String arg1) {
                  runOnUiThread(new Runnable() {
                      public void run() { disconnectAndExit(); }
                  });
              }

              @Override
              public void onConnect() {
                  // connected to the broker
                  runOnUiThread(new Runnable() {
                      public void run() {
                          if (initiator) {
                              logAndToast("Creating offer...");
                              peerConnection.createOffer(sdpObserver, sdpMediaConstraints);
                          }
                      }
                  });
              }
          };

          URI uri = null;
          try {
              // build the URI for the socket; this is how peerjs connects
              // roomKey — the API key, as above
              // id — the id we just received
              // token — a random string (generated per connection)
              uri = new URI("ws", "", "0.peerjs.com", 9000, "/peerjs",
                      "key=" + roomKey + "&id=" + id + "&token=" + token, "");
          } catch (URISyntaxException e) {
              disconnectAndExit();
          }
          client = new WebSocketClient(uri, listener, null); // create the socket
          client.connect(); // and connect
      }
  }

If you know the id (issued by the broker) of the user you want to connect to, then you need to create an offer in the socket's onConnect method.
If not, do nothing and wait.

All the code inside the socket listener must be executed on the UI thread, so it is wrapped like this:

  runOnUiThread(new Runnable() {
      public void run() {
          // ...
      }
  });


In the onMessage(final String data) method we receive messages. A message may be:
- an offer that another user sent in order to connect to us;
- an answer sent by the user to whom we sent an offer;
- a candidate, which carries information about an iceServer; we add these to our connection.
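
Reconstructed from the parsing and sending code in this article, a peerjs offer message looks roughly like this (ids shortened; only the fields the code actually reads are shown):

  {
    "type": "OFFER",
    "src": "<sender id>",
    "dst": "<recipient id>",
    "payload": {
      "type": "media",
      "connectionId": "mc_...",
      "sdp": { "type": "offer", "sdp": "v=0\r\no=- ... (full SDP text)" }
    }
  }

Candidate messages are analogous, with a "candidate" object ("sdpMid", "sdpMLineIndex", "candidate") in the payload instead of "sdp".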

Note that the structure of these messages is defined by peerjs, so with other brokers they will have to be parsed differently.

Now we implement the PCObserver class:
  private class PCObserver implements PeerConnection.Observer {
      @Override
      public void onIceCandidate(final IceCandidate candidate) {
          runOnUiThread(new Runnable() {
              public void run() {
                  JSONObject json = new JSONObject();
                  JSONObject payload = new JSONObject();
                  JSONObject jsonCandidate = new JSONObject();
                  jsonPut(json, "type", "CANDIDATE");
                  jsonPut(jsonCandidate, "sdpMid", candidate.sdpMid);
                  jsonPut(jsonCandidate, "sdpMLineIndex", candidate.sdpMLineIndex);
                  jsonPut(jsonCandidate, "candidate", candidate.sdp);
                  jsonPut(payload, "candidate", jsonCandidate);
                  jsonPut(payload, "type", "media");
                  jsonPut(payload, "connectionId", connectionId);
                  jsonPut(json, "payload", payload);
                  jsonPut(json, "dst", friendId);
                  jsonPut(json, "src", id);
                  sendMessage(json);
              }
          });
      }

      @Override
      public void onError() {
          runOnUiThread(new Runnable() {
              public void run() { disconnectAndExit(); }
          });
      }

      @Override
      public void onSignalingChange(PeerConnection.SignalingState newState) { }

      @Override
      public void onIceConnectionChange(PeerConnection.IceConnectionState newState) { }

      @Override
      public void onIceGatheringChange(PeerConnection.IceGatheringState newState) { }

      @Override
      public void onAddStream(final MediaStream stream) {
          runOnUiThread(new Runnable() {
              public void run() {
                  if (stream.videoTracks.size() == 1) {
                      stream.videoTracks.get(0).addRenderer(new VideoRenderer(remoteRender));
                  }
              }
          });
      }

      @Override
      public void onRemoveStream(final MediaStream stream) {
          runOnUiThread(new Runnable() {
              public void run() { stream.videoTracks.get(0).dispose(); }
          });
      }

      @Override
      public void onDataChannel(final DataChannel dc) { }

      @Override
      public void onRenegotiationNeeded() { }
  }

This class sends our candidates to the other user and attaches the incoming video and audio stream.

And one more class:
  private class SDPObserver implements SdpObserver {
      private SessionDescription localSdp;

      @Override
      public void onCreateSuccess(final SessionDescription origSdp) {
          final SessionDescription sdp = new SessionDescription(origSdp.type, preferISAC(origSdp.description));
          localSdp = sdp;
          runOnUiThread(new Runnable() {
              public void run() {
                  peerConnection.setLocalDescription(sdpObserver, sdp);
              }
          });
      }

      private void sendLocalDescription() {
          logAndToast("Sending " + localSdp.type);
          JSONObject json = new JSONObject();
          JSONObject payload = new JSONObject();
          JSONObject sdp = new JSONObject();
          jsonPut(json, "type", localSdp.type.canonicalForm().toUpperCase());
          jsonPut(sdp, "sdp", localSdp.description);
          jsonPut(sdp, "type", localSdp.type.canonicalForm().toLowerCase());
          jsonPut(payload, "sdp", sdp);
          jsonPut(payload, "type", "media");
          jsonPut(payload, "connectionId", connectionId);
          jsonPut(payload, "browser", "Chrome");
          jsonPut(json, "payload", payload);
          jsonPut(json, "dst", friendId);
          sendMessage(json);
      }

      @Override
      public void onSetSuccess() {
          runOnUiThread(new Runnable() {
              public void run() {
                  if (initiator) {
                      if (peerConnection.getRemoteDescription() != null) {
                          drainRemoteCandidates();
                      } else {
                          sendLocalDescription();
                      }
                  } else {
                      if (peerConnection.getLocalDescription() == null) {
                          logAndToast("Creating answer");
                          peerConnection.createAnswer(SDPObserver.this, sdpMediaConstraints);
                      } else {
                          sendLocalDescription();
                          drainRemoteCandidates();
                      }
                  }
              }
          });
      }

      @Override
      public void onCreateFailure(final String error) { }

      @Override
      public void onSetFailure(final String error) { }

      private void drainRemoteCandidates() {
          for (IceCandidate candidate : queuedRemoteCandidates) {
              peerConnection.addIceCandidate(candidate);
          }
          queuedRemoteCandidates = null;
      }
  }

This class handles the SDP exchange, through which information about iceServers and the offer/answer is transmitted.
It is also this class that sends the offer/answer, in the sendLocalDescription() method; offers and answers likewise follow the peerjs-specific structure shown above.

We also add a few auxiliary methods:
  // cycle through the cameras until we find one that opens
  private VideoCapturer getVideoCapturer() {
      String[] cameraFacing = { "front", "back" };
      int[] cameraIndex = { 0, 1 };
      int[] cameraOrientation = { 0, 90, 180, 270 };
      for (String facing : cameraFacing) {
          for (int index : cameraIndex) {
              for (int orientation : cameraOrientation) {
                  String name = "Camera " + index + ", Facing " + facing + ", Orientation " + orientation;
                  VideoCapturer capturer = VideoCapturer.create(name);
                  if (capturer != null) {
                      logAndToast("Using camera: " + name);
                      return capturer;
                  }
              }
          }
      }
      return null;
  }

  @Override
  protected void onDestroy() {
      disconnectAndExit();
      super.onDestroy();
  }

  private void logAndToast(String msg) {
      Log.d(TAG, msg);
      if (logToast != null) {
          logToast.cancel();
      }
      logToast = Toast.makeText(this, msg, Toast.LENGTH_SHORT);
      logToast.show();
  }

  private void sendMessage(JSONObject json) {
      client.send(json.toString()); // send over the socket
  }

  private static void jsonPut(JSONObject json, String key, Object value) {
      try {
          json.put(key, value);
      } catch (JSONException e) { }
  }

  // prefer the iSAC audio codec by rewriting the sdp description
  private static String preferISAC(String sdpDescription) {
      String[] lines = sdpDescription.split("\r\n");
      int mLineIndex = -1;
      String isac16kRtpMap = null;
      Pattern isac16kPattern = Pattern.compile("^a=rtpmap:(\\d+) ISAC/16000[\r]?$");
      for (int i = 0; (i < lines.length) && (mLineIndex == -1 || isac16kRtpMap == null); ++i) {
          if (lines[i].startsWith("m=audio ")) {
              mLineIndex = i;
              continue;
          }
          Matcher isac16kMatcher = isac16kPattern.matcher(lines[i]);
          if (isac16kMatcher.matches()) {
              isac16kRtpMap = isac16kMatcher.group(1);
              continue;
          }
      }
      if (mLineIndex == -1) {
          Log.d(TAG, "No m=audio line, so can't prefer iSAC");
          return sdpDescription;
      }
      if (isac16kRtpMap == null) {
          Log.d(TAG, "No ISAC/16000 line, so can't prefer iSAC");
          return sdpDescription;
      }
      String[] origMLineParts = lines[mLineIndex].split(" ");
      StringBuilder newMLine = new StringBuilder();
      int origPartIndex = 0;
      newMLine.append(origMLineParts[origPartIndex++]).append(" ");
      newMLine.append(origMLineParts[origPartIndex++]).append(" ");
      newMLine.append(origMLineParts[origPartIndex++]).append(" ");
      newMLine.append(isac16kRtpMap);
      for (; origPartIndex < origMLineParts.length; ++origPartIndex) {
          if (!origMLineParts[origPartIndex].equals(isac16kRtpMap)) {
              newMLine.append(" ").append(origMLineParts[origPartIndex]);
          }
      }
      lines[mLineIndex] = newMLine.toString();
      StringBuilder newSdpDescription = new StringBuilder();
      for (String line : lines) {
          newSdpDescription.append(line).append("\r\n");
      }
      return newSdpDescription.toString();
  }

  // shut everything down and leave the activity
  private void disconnectAndExit() {
      synchronized (quit[0]) {
          if (quit[0]) {
              return;
          }
          quit[0] = true;
          if (peerConnection != null) {
              peerConnection.dispose();
              peerConnection = null;
          }
          if (client != null) {
              client.send("{\"type\": \"bye\"}");
              client.disconnect();
              client = null;
          }
          if (videoSource != null) {
              videoSource.dispose();
              videoSource = null;
          }
          if (factory != null) {
              factory.dispose();
              factory = null;
          }
          if (audioManager != null) audioManager.abandonAudioFocus(audioFocusListener);
          finish();
      }
  }

  @Override
  public void onStop() {
      disconnectAndExit();
      super.onStop();
  }

  @Override
  public void onPause() {
      super.onPause();
      surfaceView.onPause();
      if (videoSource != null) {
          videoSource.stop(); // stop capturing
          videoSourceStopped = true;
      }
  }

  @Override
  public void onResume() {
      super.onResume();
      surfaceView.onResume();
      if (videoSource != null && videoSourceStopped) {
          videoSource.restart(); // resume capturing
      }
  }

  // workaround for a bug in the library
  private static void createDataChannelToRegressionTestBug2302(PeerConnection pc) {
      DataChannel dc = pc.createDataChannel("dcLabel", new DataChannel.Init());
      dc.close();
      dc.dispose();
  }


We have the initiator variable, false by default. It determines whether we are calling someone or waiting for a call.
In an asynchronous task I check whether the id of the user I am calling is present in the database on my server. If it is found, that user is connected to the broker, so I set initiator = true; as soon as the socket connects, an offer is created immediately and we connect to the second user (a sketch of such a check follows below).
If the id is not there, we simply wait. When someone wants to call us, they have to find out our id and send us an offer.
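
The server side of that check is specific to my app, so here is only a minimal sketch; the /user_online endpoint and its response format are hypothetical placeholders (NetHelper is the same http helper used in GetID):

  private class CheckCallee extends AsyncTask<String, Void, String> {
      @Override
      protected String doInBackground(String... params) {
          NetHelper helper = NetHelper.getInstance(RTCActivity.this);
          // hypothetical endpoint: returns the callee's broker id, or nothing if offline
          return helper.executeHttpGet("http://example.com/user_online?user=" + params[0]);
      }

      @Override
      protected void onPostExecute(String result) {
          if (result != null && !result.trim().isEmpty()) {
              friendId = result.trim(); // we will send the offer to this id
              initiator = true;         // the offer is created in onConnect()
          }
          new GetID().execute(); // connect to the broker either way
      }
  }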

If you want buttons that switch video and audio transmission off, this code will help:
  final ImageView noVideo = (ImageView) findViewById(R.id.activity_webrtc_video);
  noVideo.setOnClickListener(new OnClickListener() {
      @Override
      public void onClick(View v) {
          if (video) {
              noVideo.setImageResource(R.drawable.video_off);
              video = false;
              videoTrack.setEnabled(false);
          } else {
              noVideo.setImageResource(R.drawable.video_on);
              video = true;
              videoTrack.setEnabled(true);
          }
      }
  });

  final ImageView noAudio = (ImageView) findViewById(R.id.activity_webrtc_voice);
  noAudio.setOnClickListener(new OnClickListener() {
      @Override
      public void onClick(View v) {
          if (audio) {
              noAudio.setImageResource(R.drawable.voice_off);
              audio = false;
              audioTrack.setEnabled(false);
          } else {
              noAudio.setImageResource(R.drawable.voice_on);
              audio = true;
              audioTrack.setEnabled(true);
          }
      }
  });


And, of course, we add the activity and the permissions to the manifest:
  <uses-permission android:name="android.permission.INTERNET" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <uses-feature android:glEsVersion="0x00020000" android:required="true" />
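
The activity declaration itself is the usual one; assuming the class is called RTCActivity as in the code above, something like this (the orientation attributes are just a common choice for a call screen, not a requirement):

  <activity
      android:name=".RTCActivity"
      android:screenOrientation="portrait"
      android:configChanges="orientation|screenSize" />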


Well, that's all. Please forgive the somewhat rambling story; the code does not claim to be exemplary either.

I hope it comes in handy for someone.

Source: https://habr.com/ru/post/247079/

