Custom video source for WebRTC on Android

Overview

I would like to use a custom video source to live-stream video via the WebRTC Android implementation. If I understand correctly, the existing implementation only supports the front- and back-facing cameras on Android phones. The following classes are relevant in this scenario:

  • Camera1Enumerator.java
  • VideoCapturer.java
  • PeerConnectionFactory
  • VideoSource.java
  • VideoTrack.java

Currently, to use the front-facing camera on an Android phone, I take the following steps:

// Enumerate devices via the Camera1 API (captureToTexture = false).
CameraEnumerator enumerator = new Camera1Enumerator(false);
VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
// Create a non-screencast video source and feed it from the capturer.
VideoSource videoSource = peerConnectionFactory.createVideoSource(false);
videoCapturer.initialize(surfaceTextureHelper, this.getApplicationContext(), videoSource.getCapturerObserver());
VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VideoTrackID, videoSource);

My scenario

I have a callback handler that receives the video buffer as a byte array from the custom video source:

public void onReceive(byte[] videoBuffer, int size) {}

How can I send this byte array buffer? I'm not sure about the solution, but I think I would have to implement a custom VideoCapturer.

Existing questions

This question might be relevant, though I'm not using the libjingle library, only the native WebRTC Android package.

Similar questions/articles:

  • for the iOS platform, though unfortunately the answers there didn't help
  • for the native C++ platform
  • an article about a native implementation


1 Answer

There are two possible solutions to this problem:

  1. Implement a custom VideoCapturer and create a VideoFrame from the byte[] stream data in the onReceive handler. There is actually a very good example in FileVideoCapturer, which implements VideoCapturer; a minimal skeleton following this approach is sketched after the example below.
  2. Simply construct a VideoFrame from an NV21Buffer, which is created from our byte array stream data. Then we only need to use our previously created VideoSource to capture this frame. Example:
public void onReceive(byte[] videoBuffer, int size, int width, int height) {
    // Frame timestamps must be in nanoseconds and monotonically increasing.
    long timestampNS = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
    // Wrap the raw NV21 bytes; no release callback is needed here.
    NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);

    VideoFrame videoFrame = new VideoFrame(buffer, 0 /* rotation */, timestampNS);
    // Deliver the frame through the VideoSource's capturer observer.
    videoSource.getCapturerObserver().onFrameCaptured(videoFrame);

    videoFrame.release();
}
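
For completeness, here is a minimal sketch of option 1. The class name ByteArrayVideoCapturer and the way the custom source hands frames over are assumptions for illustration; the fixed parts are the org.webrtc.VideoCapturer interface and the NV21 frame construction shown in option 2. It assumes the incoming byte[] is NV21 data of known width and height.

import android.content.Context;
import android.os.SystemClock;

import java.util.concurrent.TimeUnit;

import org.webrtc.CapturerObserver;
import org.webrtc.NV21Buffer;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

public class ByteArrayVideoCapturer implements VideoCapturer {
    private CapturerObserver capturerObserver;

    @Override
    public void initialize(SurfaceTextureHelper surfaceTextureHelper,
                           Context applicationContext,
                           CapturerObserver capturerObserver) {
        this.capturerObserver = capturerObserver;
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        // Subscribe to the custom source here, then report success.
        capturerObserver.onCapturerStarted(true);
    }

    // Callback from the custom video source (hypothetical signature,
    // mirroring the onReceive handler from the question).
    public void onReceive(byte[] videoBuffer, int size, int width, int height) {
        long timestampNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
        NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);
        VideoFrame frame = new VideoFrame(buffer, 0 /* rotation */, timestampNs);
        capturerObserver.onFrameCaptured(frame);
        frame.release();
    }

    @Override
    public void stopCapture() throws InterruptedException {
        // Unsubscribe from the custom source here.
        capturerObserver.onCapturerStopped();
    }

    @Override
    public void changeCaptureFormat(int width, int height, int framerate) {
        // No-op: the custom source controls its own format.
    }

    @Override
    public void dispose() {}

    @Override
    public boolean isScreencast() {
        return false;
    }
}

Wiring it up then mirrors the camera path from the question:

ByteArrayVideoCapturer capturer = new ByteArrayVideoCapturer();
VideoSource videoSource = peerConnectionFactory.createVideoSource(false);
capturer.initialize(surfaceTextureHelper, getApplicationContext(), videoSource.getCapturerObserver());
capturer.startCapture(width, height, 30);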