
Decode android's hardware encoded H264 camera feed using ffmpeg in real time

I'm trying to use the hardware H264 encoder on Android to create video from the camera, and use FFmpeg to mux in audio (all on the Android phone itself).

What I've accomplished so far is packetizing the H264 video into RTSP packets and decoding it using VLC (over UDP), so I know the video is at least correctly formatted. However, I'm having trouble getting the video data to ffmpeg in a format it can understand.

I've tried sending the same RTSP packets to port 5006 on localhost (over UDP), then providing ffmpeg with an SDP file that tells it which local port the video stream is coming in on and how to decode the video, if I understand RTSP streaming correctly. However, this doesn't work, and I'm having trouble diagnosing why; ffmpeg just sits there waiting for input.
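(For reference, a minimal SDP for an H264/RTP stream on that port looks roughly like this; the port and payload type are just example values:)

v=0
o=- 0 0 IN IP4 127.0.0.1
s=Android camera stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1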

For reasons of latency and scalability I can't just send the video and audio to the server and mux them there; it has to be done on the phone, in as lightweight a manner as possible.

What I guess I'm looking for are suggestions as to how this can be accomplished. The optimal solution would be sending the packetized H264 video to ffmpeg over a pipe, but then I can't give ffmpeg the SDP parameters it needs to decode the video.

I can provide more information on request, like how ffmpeg is compiled for Android, but I doubt that's necessary.

Oh, and I start ffmpeg from the command line; I would really rather avoid mucking about with JNI if that's at all possible.

Any help would be much appreciated, thanks.

asked Oct 12 '11 by joebobfrank


2 Answers

Have you tried using java.lang.Runtime?

String[] parameters = {"ffmpeg", "other", "args"};
Process program = Runtime.getRuntime().exec(parameters);

InputStream in = program.getInputStream();
OutputStream out = program.getOutputStream();
InputStream err = program.getErrorStream();

Then you write to the process's stdin (the stream returned by getOutputStream()) and read from its stdout and stderr. It's not a named pipe, but it should be better than going through a network interface.
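For example, a rough sketch of piping the encoder output straight into ffmpeg; the binary location and arguments are only illustrative, and it assumes your ffmpeg build accepts a raw Annex-B H.264 elementary stream on stdin:

import java.io.OutputStream;

public class FfmpegPipe {
    private Process ffmpeg;
    private OutputStream videoIn;

    public void start(String outputPath) throws Exception {
        String[] cmd = {
            "/data/data/com.example.app/files/ffmpeg", // wherever you unpacked your ffmpeg binary
            "-f", "h264", "-i", "pipe:0",              // raw H.264 elementary stream on stdin
            "-vcodec", "copy",                         // no re-encoding, just remuxing
            outputPath
        };
        ffmpeg = Runtime.getRuntime().exec(cmd);
        videoIn = ffmpeg.getOutputStream();            // connected to the child's stdin
    }

    // Feed encoded H.264 data (e.g. each buffer produced by the hardware encoder).
    public void write(byte[] data, int offset, int length) throws Exception {
        videoIn.write(data, offset, length);
        videoIn.flush();
    }

    public void stop() throws Exception {
        videoIn.close();    // EOF tells ffmpeg the stream is finished
        ffmpeg.waitFor();
    }
}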

answered Sep 17 '22 by Douglas Jones


A little bit late, but I think this is a good question and it doesn't have a good answer yet.

If you want to stream the camera and mic from an Android device, you have two main alternatives: a Java or an NDK implementation.

  1. Java implementation.

I'm only going to mention the idea, but basically it is to implement an RTSP server and the RTP protocol in Java, based on the standards Real-Time Streaming Protocol Version 2.0 and RTP Payload Format for H.264 Video. This task will be very long and hard, but if you are doing your PhD it could be nice to end up with a good RTSP Java lib for Android.
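Just to give an idea of what that involves, here is a bare-bones sketch of RTP packetization for small NAL units (single NAL unit mode per RFC 6184). The class and method names are made up for illustration, and a real implementation also needs FU-A fragmentation for NAL units larger than the MTU, plus the RTSP session handling on top:

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

class RtpH264Packetizer {
    private static final int PAYLOAD_TYPE = 96;   // dynamic payload type, must match the SDP
    private final DatagramSocket socket = new DatagramSocket();
    private final InetAddress dest;
    private final int port;
    private final int ssrc = (int) (Math.random() * Integer.MAX_VALUE);
    private int seq = 0;

    RtpH264Packetizer(String host, int port) throws Exception {
        this.dest = InetAddress.getByName(host);
        this.port = port;
    }

    // Sends one NAL unit (without its start code) as a single RTP packet.
    void send(byte[] nal, long timestamp90kHz, boolean lastOfFrame) throws Exception {
        byte[] packet = new byte[12 + nal.length];
        packet[0] = (byte) 0x80;                                      // V=2, P=0, X=0, CC=0
        packet[1] = (byte) ((lastOfFrame ? 0x80 : 0) | PAYLOAD_TYPE); // marker bit + payload type
        packet[2] = (byte) (seq >> 8);
        packet[3] = (byte) seq;
        packet[4] = (byte) (timestamp90kHz >> 24);
        packet[5] = (byte) (timestamp90kHz >> 16);
        packet[6] = (byte) (timestamp90kHz >> 8);
        packet[7] = (byte) timestamp90kHz;
        packet[8] = (byte) (ssrc >> 24);
        packet[9] = (byte) (ssrc >> 16);
        packet[10] = (byte) (ssrc >> 8);
        packet[11] = (byte) ssrc;
        System.arraycopy(nal, 0, packet, 12, nal.length);             // the NAL unit payload
        socket.send(new DatagramPacket(packet, packet.length, dest, port));
        seq = (seq + 1) & 0xFFFF;
    }
}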

  2. NDK implementation.

This alternative includes various solutions. The main idea is to use a powerful C or C++ library in our Android application, in this case FFmpeg. This library can be compiled for Android and supports various architectures. The problem with this approach is that you may need to learn about the Android NDK, C and C++ to accomplish it.

But there is an alternative: you can wrap the C library and use FFmpeg from Java. But how?

For example, you can use FFmpeg Android, which has been compiled with x264, libass, fontconfig, freetype and fribidi and supports various architectures. But it is still hard to program: if you want to stream in real time you have to deal with file descriptors and input/output streams yourself.

The best alternative, from a Java programming point of view, is to use JavaCV. JavaCV uses wrappers of commonly used computer vision libraries (OpenCV, FFmpeg, etc.) and provides utility classes that make their functionality easier to use on the Java platform, including (of course) Android.

JavaCV also comes with hardware accelerated full-screen image display (CanvasFrame and GLCanvasFrame), easy-to-use methods to execute code in parallel on multiple cores (Parallel), user-friendly geometric and color calibration of cameras and projectors (GeometricCalibrator, ProCamGeometricCalibrator, ProCamColorCalibrator), detection and matching of feature points (ObjectFinder), a set of classes that implement direct image alignment of projector-camera systems (mainly GNImageAligner, ProjectiveTransformer, ProjectiveColorTransformer, ProCamTransformer, and ReflectanceInitializer), a blob analysis package (Blobs), as well as miscellaneous functionality in the JavaCV class. Some of these classes also have an OpenCL and OpenGL counterpart, their names ending with CL or starting with GL, i.e.: JavaCVCL, GLCanvasFrame, etc.

But how can we use this solution?

Here is a basic implementation that streams over UDP.

String streamURL = "udp://ip_destination:port";
recorder = new FFmpegFrameRecorder(streamURL, frameWidth, frameHeight, 1);
recorder.setInterleaved(false);

// video options
recorder.setFormat("mpegts");
recorder.setVideoOption("tune", "zerolatency");
recorder.setVideoOption("preset", "ultrafast");
recorder.setVideoBitrate(5 * 1024 * 1024);
recorder.setFrameRate(30);
recorder.setSampleRate(AUDIO_SAMPLE_RATE);
recorder.setVideoCodec(AV_CODEC_ID_H264);
recorder.setAudioCodec(AV_CODEC_ID_AAC);

This part of the code shows how to initialize the FFmpegFrameRecorder object called recorder. This object will capture and encode the frames obtained from the camera and the samples obtained from the microphone.
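Note that once the recorder is configured it also has to be started before any frames or samples are recorded, and stopped and released when you are done (a short sketch, error handling omitted):

recorder.start();    // opens the UDP output and writes the stream header
// ... record frames and audio samples while streaming ...
recorder.stop();     // flushes buffered packets and closes the output
recorder.release();  // frees the native resources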

If you want to show a camera preview in the same Android app, you need to implement a CameraPreview class. This class converts the raw data served by the Camera, creating both the preview and the Frame for the FFmpegFrameRecorder.
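One possible CameraPreview along those lines, based on the classic JavaCV RecordActivity pattern (a sketch only: it assumes the old android.hardware.Camera API with the default NV21 preview format, and it is not necessarily how the repository linked below does it):

import android.hardware.Camera;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.bytedeco.ffmpeg.global.avutil; // org.bytedeco.javacpp.avutil on older JavaCV versions
import java.nio.ByteBuffer;

class CameraPreview implements Camera.PreviewCallback {
    private final FFmpegFrameRecorder recorder;
    private final Frame yuvFrame;

    CameraPreview(FFmpegFrameRecorder recorder, int width, int height) {
        this.recorder = recorder;
        // Reusable Frame big enough to hold one NV21 preview buffer.
        this.yuvFrame = new Frame(width, height, Frame.DEPTH_UBYTE, 2);
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        try {
            // Copy the raw NV21 bytes into the Frame and hand it to the recorder
            // (timestamp handling omitted; see onCameraFrame below for how that is done).
            ((ByteBuffer) yuvFrame.image[0].position(0)).put(data);
            recorder.record(yuvFrame, avutil.AV_PIX_FMT_NV21);
        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }
    }
}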

Remember to replace ip_destination with the IP of the PC or device you want to send the stream to. The port can be 8080, for example.

@Override
public Mat onCameraFrame(Mat mat) {
    if (audioRecordRunnable == null) {
        startTime = System.currentTimeMillis();
        return mat;
    }
    if (recording && mat != null) {
        synchronized (semaphore) {
            try {
                Frame frame = converterToMat.convert(mat);
                long t = 1000 * (System.currentTimeMillis() - startTime);
                if (t > recorder.getTimestamp()) {
                    recorder.setTimestamp(t);
                }
                recorder.record(frame);
            } catch (FFmpegFrameRecorder.Exception e) {
                LogHelper.i(TAG, e.getMessage());
                e.printStackTrace();
            }
        }
    }
    return mat;
}

This shows the implementation of the onCameraFrame method, which gets a Mat (picture) from the camera, converts it to a Frame and records it with the FFmpegFrameRecorder object.

@Override
public void onSampleReady(ShortBuffer audioData) {
    if (recorder == null) return;
    if (recording && audioData == null) return;

    try {
        long t = 1000 * (System.currentTimeMillis() - startTime);
        if (t > recorder.getTimestamp()) {
            recorder.setTimestamp(t);
        }
        LogHelper.e(TAG, "audioData: " + audioData);
        recorder.recordSamples(audioData);
    } catch (FFmpegFrameRecorder.Exception e) {
        LogHelper.v(TAG, e.getMessage());
        e.printStackTrace();
    }
}

It is the same for the audio: audioData is a ShortBuffer object that will be recorded by the FFmpegFrameRecorder.

On the destination PC or device you can run the following command to receive the stream.

ffplay udp://ip_source:port 

ip_source is the IP of the smartphone that is streaming the camera and mic, and the port must match the one used in streamURL (8080 in this example).

I created a solution in my GitHub repository here: UDPAVStreamer.

Good luck

answered Sep 18 '22 by Teocci