This could be quite an interesting topic for people who are interested in livestreaming from a device to a webserver (primarily Android/Java). I have finally found a way to livestream from my device's camera to my webserver (website). On a Wi-Fi network it takes approx. 1 frame/second to show, and it also works on EDGE/3G networks. In this topic/question I want to discuss new techniques, improvements and ideas about livestreaming, and I will share mine with you (code is appreciated too).
My code repeatedly takes a snapshot from the camera preview by using setOneShotPreviewCallback() to call onPreviewFrame(). The frame is delivered in YUV format, so raw2jpg() converts it into 32-bit ARGB for the JPEG encoder. (NV21 is a YUV semi-planar format: a full-resolution Y plane followed by interleaved, subsampled V/U bytes.)
getPicture() is called by the application; it produces the JPEG data for the image in the private byte array mCurrentFrame and returns that array.
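For reference, a raw2jpg()-style NV21-to-ARGB conversion can be sketched like this. This is not the original code; the class name is my own, and it uses the common fixed-point integer YUV-to-RGB approximation:

```java
public class Nv21Converter {
    // Convert an NV21 frame (full-resolution Y plane followed by
    // interleaved V/U bytes) into 32-bit ARGB pixels for a JPEG encoder.
    public static int[] nv21ToArgb(byte[] nv21, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int yVal = (nv21[y * width + x] & 0xFF) - 16;
                if (yVal < 0) yVal = 0;
                // One V/U pair is shared by each 2x2 block of luma samples.
                int uvIndex = frameSize + (y >> 1) * width + (x & ~1);
                int v = (nv21[uvIndex] & 0xFF) - 128;
                int u = (nv21[uvIndex + 1] & 0xFF) - 128;
                int y1192 = 1192 * yVal;
                int r = clamp((y1192 + 1634 * v) >> 10);
                int g = clamp((y1192 - 833 * v - 400 * u) >> 10);
                int b = clamp((y1192 + 2066 * u) >> 10);
                argb[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private static int clamp(int c) {
        return c < 0 ? 0 : (c > 255 ? 255 : c);
    }
}
```

On Android specifically, android.graphics.YuvImage.compressToJpeg() can do the YUV-to-JPEG step directly and is usually faster than converting by hand.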
After this, the byte array mCurrentFrame gets Base64-encoded and sent to my webserver in an HTTP POST, together with the Base64 string value and my own ID code so other people won't be able to send a different image to it. At the webserver it gets decoded again and written to the file test.jpg. PHP and JavaScript run on the webserver: PHP handles the POST, and JavaScript reloads the image every 750 milliseconds. This is basically how it works.
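The upload step described above could look roughly like this in plain Java. The endpoint URL and form-field names here are placeholders, not the ones actually used; also note that java.util.Base64 requires Java 8 (on older Android you would use android.util.Base64 instead):

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.util.Base64;

public class FrameUploader {
    // Base64-encode a JPEG frame; the same string the server-side PHP
    // would base64_decode() before writing test.jpg.
    public static String encodeFrame(byte[] jpeg) {
        return Base64.getEncoder().encodeToString(jpeg);
    }

    // POST the encoded frame plus an ID token as a form body.
    public static int postFrame(String endpoint, byte[] jpeg, String id) throws Exception {
        String body = "id=" + URLEncoder.encode(id, "UTF-8")
                    + "&frame=" + URLEncoder.encode(encodeFrame(jpeg), "UTF-8");
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes("UTF-8"));
        }
        return conn.getResponseCode();
    }
}
```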
Now I am very interested in your ideas, improvements and other things you would like to add or ask. Here are some of my questions:
1) What would be the best method for live streaming WITH audio? Video Recording OR my method + Audio recording?
2) How would you approach video record streaming?
3) How would you stream audio to the webserver? (Main goal) (With Java, PHP and JavaScript)
4) I am also planning to add typical live-streaming features to it; e.g. when a famous person appears, you could have the ability to show his name while you are live streaming, or just add an image from your SD card to your livestream. Would you also decode it and overlay the image, or put the image into your livestream in some other way?
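One possible answer to question 4: since the frames are already 32-bit ARGB before JPEG encoding, an overlay (a name banner, an image from the SD card) can simply be alpha-blended into the pixel array before encoding. A minimal sketch, with names of my own choosing:

```java
public class Overlay {
    // Alpha-blend an ARGB overlay into an ARGB frame at (left, top).
    // Fully transparent overlay pixels leave the frame untouched.
    public static void blend(int[] frame, int frameW, int frameH,
                             int[] ov, int ovW, int ovH, int left, int top) {
        for (int y = 0; y < ovH; y++) {
            int fy = top + y;
            if (fy < 0 || fy >= frameH) continue;
            for (int x = 0; x < ovW; x++) {
                int fx = left + x;
                if (fx < 0 || fx >= frameW) continue;
                int src = ov[y * ovW + x];
                int a = src >>> 24;
                if (a == 0) continue;           // fully transparent: skip
                if (a == 255) {                 // fully opaque: overwrite
                    frame[fy * frameW + fx] = src;
                    continue;
                }
                int dst = frame[fy * frameW + fx];
                int r = ((src >> 16 & 0xFF) * a + (dst >> 16 & 0xFF) * (255 - a)) / 255;
                int g = ((src >> 8 & 0xFF) * a + (dst >> 8 & 0xFF) * (255 - a)) / 255;
                int b = ((src & 0xFF) * a + (dst & 0xFF) * (255 - a)) / 255;
                frame[fy * frameW + fx] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
    }
}
```

On Android one would more likely draw the overlay onto a Bitmap with a Canvas, but the pixel-array form fits the ARGB pipeline described above.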
This topic is primarily for questions, so hopefully it can be of great help to some people out here. Therefore I added a bounty of 50 (woot!) rep to it.
Sincerely, XverhelstX
It strikes me that HTTP POSTing is probably not a good way to do live streaming of video to your server. Other people have been experimenting with live streaming, and they've used a socket to broadcast live video and audio streams to their servers.
I thought this was fascinating -- here's a link.
http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system
But the guy also posted a partial code sample -
import java.net.InetAddress;
import java.net.Socket;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

String hostname = "your.host.name";
int port = 1234;

// Open a TCP connection and wrap its file descriptor so MediaRecorder
// can treat the network stream as an ordinary output file.
Socket socket = new Socket(InetAddress.getByName(hostname), port);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

MediaRecorder recorder = new MediaRecorder();
// Additional MediaRecorder setup (audio/video sources, output format, etc.) omitted
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
The cool part I didn't know about is the ParcelFileDescriptor: it wraps the socket in a file descriptor, so anything written to that "file" is actually sent over the network to the remote server. Sockets are also the right tool for this sort of thing because they let you send data continuously until your recording is complete, without re-sending headers over and over.
What I think is cool about this technique is that he's literally taking the output from MediaRecorder (an already-encoded video stream) and pumping it over a socket to his server. The server can then simply save the data coming in over the socket. No frame-by-frame work, no processing (the Android SDK doesn't expose the encoders well, and software encoding is performance-intensive).
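The receiving end of that socket can be very simple. A minimal Java sketch (mine, not from the linked article) that just saves whatever arrives:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class StreamReceiver {
    // Copy everything the phone sends over the socket straight to a sink
    // (e.g. a FileOutputStream for the recording). Returns bytes written.
    public static long drain(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        out.flush();
        return total;
    }

    // Accept a single connection on the given port and save the stream.
    public static void serveOnce(int port, OutputStream sink) throws IOException {
        try (ServerSocket server = new ServerSocket(port);
             Socket client = server.accept();
             InputStream in = client.getInputStream()) {
            drain(in, sink);
        }
    }
}
```

One caveat: with MPEG-4 output, MediaRecorder finalizes the file header by seeking back to the start, which is impossible over a socket, so the saved stream may need a streamable container or post-processing before it plays.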
People report that it works, but I haven't tested. Anyway, hope this is helpful.
You are sending a whole snapshot each time? Why don't you try some video-compression techniques: instead of sending a full image each time, send a compressed version (maybe a diff, or something like that), and then on the server create the image based on your last image and the data just received. Almost all video codecs do this; you could look at some of the open codec specifications to get ideas.
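As an illustration of the idea (nothing like a real codec, which adds motion compensation, transforms and entropy coding), a naive per-pixel delta between consecutive ARGB frames could look like this; all names are mine:

```java
import java.util.ArrayList;
import java.util.List;

public class FrameDelta {
    // Sender side: record only the pixels that changed since the last
    // frame, as (index, newValue) pairs.
    public static List<int[]> diff(int[] prev, int[] cur) {
        List<int[]> changes = new ArrayList<>();
        for (int i = 0; i < cur.length; i++) {
            if (prev[i] != cur[i]) {
                changes.add(new int[]{i, cur[i]});
            }
        }
        return changes;
    }

    // Server side: rebuild the current frame from the previous one
    // plus the received changes.
    public static int[] apply(int[] prev, List<int[]> changes) {
        int[] cur = prev.clone();
        for (int[] c : changes) {
            cur[c[0]] = c[1];
        }
        return cur;
    }
}
```

For mostly static scenes this sends far fewer bytes than a full JPEG per frame, at the cost of needing an occasional full "keyframe" so a newly joined viewer can start decoding.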
About audio: I would send the audio stream separately and then sync it with the video stream based on which video frame is being shown at the moment.
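If the audio is raw PCM captured at a known sample rate (e.g. from Android's AudioRecord), lining it up with a given video frame is simple arithmetic; a small helper sketch, with hypothetical names:

```java
public class AvSync {
    // For uncompressed PCM sent on a separate stream, the byte offset
    // that lines up with a given video frame is a pure function of the
    // frame index, the video frame rate, and the PCM format.
    public static long audioByteOffsetForFrame(long frameIndex, double fps,
                                               int sampleRate, int channels,
                                               int bytesPerSample) {
        long samples = Math.round(frameIndex / fps * sampleRate);
        return samples * channels * bytesPerSample;
    }
}
```

At the roughly 1 frame/second rate described in the question, frame 2 of a mono 16-bit 8 kHz stream would correspond to byte offset 32000.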
Basically, I would try to get my streaming as close as possible to how real video streaming works. Maybe you could look into FFmpeg; FFmpeg has an RTSP server, and if you could build that for Android it would simplify your work a lot.