I have two simple programs, a server and a client, both running on localhost. What I want to do is stream video from the server to the client through a socket and have the client play it using the file descriptor of that socket. First I sent a test message, which the client received correctly. After that I sent a few bytes of the video from the server's SD card to the client. The client receives those bytes but cannot play them. Does anyone know how to solve this?
Here are my server and client code snippets:
Server:
//Wait for and accept a connection request from the client.
ServerSocket serversocket=new ServerSocket(4444);
Socket client=serversocket.accept();
System.out.println("accept");
//Receive the client's message.
BufferedReader in=new BufferedReader(new InputStreamReader(client.getInputStream()));
String str=in.readLine();
System.out.println("read:"+str);
//Send message to client.
//PrintWriter out=new PrintWriter(new BufferedWriter(new OutputStreamWriter(client.getOutputStream())),true);
//out.println("server message");
//Send the first bytes of the video file to the client.
FileInputStream fis=new FileInputStream("/sdcard/toystory3.3gp");
byte buffer[]=new byte[2000];
int count=fis.read(buffer,0,20);
DataOutputStream out=new DataOutputStream(client.getOutputStream());
out.write(buffer,0,count);
out.flush();
fis.close();
in.close();
out.close();
client.close();
System.out.println("close");
Client:
Socket socket=new Socket("127.0.0.1",4444);
String message="Initial";
//Send the request message to the server; println appends the line terminator.
PrintWriter out=new PrintWriter(new BufferedWriter(new OutputStreamWriter(socket.getOutputStream())),true);
out.println(message);
//Receive message from server.
BufferedReader br=new BufferedReader(new InputStreamReader(socket.getInputStream()));
String msg=br.readLine();
//ParcelFileDescriptor pfd=ParcelFileDescriptor.fromSocket(socket);
//MediaPlayer m=new MediaPlayer();
//m.setDataSource(pfd.getFileDescriptor());
//m.prepare();
//m.start();
if(msg!=null)
{
    System.out.println("Data received.");
    System.out.println(msg);
}
else
{
    System.out.println("Data not received.");
}
out.close();
br.close();
socket.close();
For a server, you usually create a socket, then bind it to a specific port, and accept connections. For the client, you create a socket, and connect to a specified address (an IP address and port pair for a TCP/IP connection). The same device can run a TCP server and client at the same time.
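As a reference, here is a minimal, self-contained sketch of that pattern in plain Java; the port number (4444), class name, and message text are arbitrary choices for illustration, not taken from the question:
import java.io.*;
import java.net.*;

public class EchoDemo {
    public static void main(String[] args) throws IOException {
        //Server side: bind to a port, then accept one connection on a background thread.
        ServerSocket server=new ServerSocket(4444);
        new Thread(() -> {
            try (Socket peer=server.accept();
                 BufferedReader in=new BufferedReader(new InputStreamReader(peer.getInputStream()));
                 PrintWriter out=new PrintWriter(peer.getOutputStream(),true)) {
                //Echo the client's line back.
                out.println("server got: "+in.readLine());
            } catch (IOException e) {
                e.printStackTrace();
            }
        }).start();

        //Client side: connect to the same address and port on the same device.
        try (Socket socket=new Socket("127.0.0.1",4444);
             PrintWriter out=new PrintWriter(socket.getOutputStream(),true);
             BufferedReader in=new BufferedReader(new InputStreamReader(socket.getInputStream()))) {
            out.println("hello");
            System.out.println(in.readLine()); //prints "server got: hello"
        }
        server.close();
    }
}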
This won't work because 3gp (and other MP4-family container files such as mp4) typically have their header, the moov atom, at the end of the file, so any player must have access to the whole file before it can start playback.
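One workaround that is consistent with this constraint (a rough sketch, not the only option): copy the complete file from the socket into local storage first, and only then hand MediaPlayer a normal file path. The cache file name, the socket variable, and the context variable below are placeholders, and exception handling is omitted as in the snippets above:
//"socket" is the connected client Socket and "context" is an Android Context; both are placeholders.
File cache=new File(context.getCacheDir(),"video.3gp");
InputStream is=socket.getInputStream();
FileOutputStream fos=new FileOutputStream(cache);
byte[] buf=new byte[8192];
int n;
//Copy until the server closes its side of the connection.
while((n=is.read(buf))!=-1){
    fos.write(buf,0,n);
}
fos.close();
//The whole file, including its trailing header, is now available locally.
MediaPlayer player=new MediaPlayer();
player.setDataSource(cache.getAbsolutePath());
player.prepare();
player.start();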
RTSP/RTP is the only way to stream the video at the moment. HTTP adaptive streaming is in the works.
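For example, Android's MediaPlayer can play an RTSP stream directly from a URL; a minimal sketch (the stream URL is a placeholder, and exception handling is omitted as above):
MediaPlayer player=new MediaPlayer();
//The URL is a placeholder; point it at a real RTSP stream.
player.setDataSource("rtsp://example.com/stream.3gp");
player.setOnPreparedListener(mp -> mp.start());
//prepareAsync() buffers the stream without blocking the UI thread.
player.prepareAsync();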
Also, if you're trying to do peer-to-peer video (device to device), be aware that devices on mobile operator networks are behind a NAT firewall and can only open outbound connections, so you'll need some kind of NAT traversal (hole punching).