I am developing an app that shares its screen with other apps.
I used the MediaProjection API for this, and MediaMuxer to combine the audio and video outputs of the screen recording.
I know that the MediaProjection APIs are meant for screen recording, but all I want is to share the screen while it is being recorded.
For this, I have modified the writeSampleData method of my MediaMuxer wrapper class so that it also sends the bytes via a socket to the other device over the network.
Below is the code for that:
private OutputStream outStream;

// During initialization, once the socket to the receiver is connected:
try {
    outStream = ScreenRecordingActivity.getInstance().socket.getOutputStream();
} catch (IOException e) {
    e.printStackTrace();
}

void writeSampleData(final int trackIndex, final ByteBuffer byteBuf, final MediaCodec.BufferInfo bufferInfo) {
    if (mStatredCount > 0) {
        // Write the sample to the local recording as before.
        mMediaMuxer.writeSampleData(trackIndex, byteBuf, bufferInfo);
        if (bufferInfo.size != 0) {
            byteBuf.position(bufferInfo.offset);
            byteBuf.limit(bufferInfo.offset + bufferInfo.size);
            if (outStream != null) {
                try {
                    byte[] bytes = new byte[byteBuf.remaining()];
                    byteBuf.get(bytes);
                    // Send the encoded data over the socket.
                    outStream.write(bytes);
                    outStream.flush();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
The bytes are successfully transferred over the socket, and I am able to receive them at the receiver's end.
Below is the code for receiving bytes at the receiver's end:
private class SocketThread implements Runnable {

    @Override
    public void run() {
        Socket socket;
        try {
            serverSocket = new ServerSocket(SERVER_PORT);
        } catch (IOException e) {
            e.printStackTrace();
        }
        if (null != serverSocket) {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    // Wait for the sender to connect, then handle it on its own thread.
                    socket = serverSocket.accept();
                    CommunicationThread commThread = new CommunicationThread(socket);
                    new Thread(commThread).start();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }

    class CommunicationThread implements Runnable {
        InputStream in;
        DataInputStream dis;
        FileOutputStream fos; // file the received bytes are written to

        public CommunicationThread(Socket clientSocket) {
            updateMessage("Server Started...");
            try {
                in = clientSocket.getInputStream();
                dis = new DataInputStream(in);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        public void run() {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    // Read the incoming bytes into a buffer and handle them.
                    byte[] data = new byte[512];
                    dis.read(data);
                } catch (Exception e) {
                    e.printStackTrace();
                    try {
                        fos.close();
                    } catch (Exception e1) {
                        e1.printStackTrace();
                    }
                }
            }
        }
    }
}
I referred to the following links for screen sharing:
Screen capture
screenrecorder
Screen recording with mediaProjection
I used some code from the above examples to make an app.
All I want to know is how to handle the bytes at the receiver. How do I format these bytes to play a live stream from the sender's side?
Am I following the correct approach for sending and receiving byte data?
Does MediaProjection allow streaming the screen between applications while it is being recorded?
Any help will be deeply appreciated.
Generally, for streaming (including screen sharing), the audio and video tracks are not muxed. Instead, each video frame and audio sample is sent using a protocol such as RTP/RTSP, in which each data chunk is wrapped with metadata such as a timestamp.
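As a rough illustration of that idea, a minimal framing for a plain socket could prepend each encoded chunk with a track id, a presentation timestamp, flags and a length prefix. This is not RTP; the helper below is purely a hypothetical sketch of "wrap each chunk with metadata":

import android.media.MediaCodec;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

class ChunkSender {
    // Hypothetical framing: track id + presentation timestamp + flags + length + payload.
    // The receiver reads the same fields back in the same order to re-frame the stream.
    static void sendEncodedChunk(DataOutputStream out, int trackIndex,
                                 ByteBuffer encodedData, MediaCodec.BufferInfo info) throws IOException {
        encodedData.position(info.offset);
        encodedData.limit(info.offset + info.size);
        byte[] payload = new byte[info.size];
        encodedData.get(payload);

        out.writeInt(trackIndex);               // e.g. 0 = video, 1 = audio, by convention
        out.writeLong(info.presentationTimeUs); // timestamp used for receiver-side sync
        out.writeInt(info.flags);               // e.g. MediaCodec.BUFFER_FLAG_KEY_FRAME
        out.writeInt(payload.length);           // length prefix so the receiver knows where the chunk ends
        out.write(payload);
        out.flush();
    }
}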
You can take a look at spyadroid, which is a good starting point for streaming audio and video over RTSP to a browser or VLC. It streams the camera and microphone, but you can adapt it to your own use case.
If you want to stick with plain sockets for the moment, you have to get rid of the MediaMuxer and send the frames/samples directly from the encoder output, accompanied at least by timestamps so that playback can be synchronized on the receiver side. Assuming you encode in H.264, you first need to send the codec-specific data (the SPS and PPS, a.k.a. csd-0 and csd-1, which you can get when the encoder's output format changes) to the receiver's decoder, which you can configure with an output Surface to render your stream.
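For the H.264 case, a minimal sketch of that hand-off might look like the following. Only the MediaCodec/MediaFormat calls are real Android API; the class name, the width/height parameters and the commented-out sendConfig() helper are assumptions, and error handling is omitted:

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

class H264StreamSketch {

    // Sender side: when the encoder reports INFO_OUTPUT_FORMAT_CHANGED, its output
    // format carries the SPS/PPS as "csd-0"/"csd-1". Ship them over the socket
    // before any frame data (sendConfig() is a hypothetical helper).
    static void onEncoderFormatChanged(MediaCodec encoder) {
        MediaFormat format = encoder.getOutputFormat();
        ByteBuffer csd0 = format.getByteBuffer("csd-0"); // SPS
        ByteBuffer csd1 = format.getByteBuffer("csd-1"); // PPS
        // sendConfig(csd0, csd1);
    }

    // Receiver side: rebuild a MediaFormat from the received SPS/PPS and configure
    // a decoder with an output Surface (e.g. from a SurfaceView) so decoded frames
    // are rendered directly.
    static MediaCodec createDecoder(byte[] sps, byte[] pps, int width, int height,
                                    Surface surface) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
        format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
        MediaCodec decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        decoder.configure(format, surface, null, 0);
        decoder.start();
        return decoder;
    }

    // Receiver side: feed one received encoded frame into the decoder and render
    // whatever output is ready to the Surface.
    static void decodeFrame(MediaCodec decoder, byte[] frame, long ptsUs) {
        int inIndex = decoder.dequeueInputBuffer(10_000);
        if (inIndex >= 0) {
            ByteBuffer input = decoder.getInputBuffer(inIndex);
            input.clear();
            input.put(frame);
            decoder.queueInputBuffer(inIndex, 0, frame.length, ptsUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = decoder.dequeueOutputBuffer(info, 10_000);
        if (outIndex >= 0) {
            decoder.releaseOutputBuffer(outIndex, true); // true = render to the Surface
        }
    }
}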
Some extra links:
android-h264-stream-demo
RTMP Java Muxer for Android
RTSP
RTP
WebRTC