I'm trying to get the camera working from native code in Android ICS.
Most manuals refer to the startPreview() method, but while browsing the AOSP code I also found a startRecording() method in <Camera.h>.
It is described as coming from the ICameraRecordingProxy interface,
"that allows the recorder to receive video frames during recording".
So the question is: in terms of performance, is the startRecording() approach more efficient than startPreview()?
The only reason for going native is performance: the Java Camera API is too slow, and OpenCV does not deliver the required FPS either.
EDIT: the target platform is API level 17 on an Allwinner A31 development board, at 1280x720 @ 30 FPS. The task is to capture frames from the camera, modify them, encode them (H.264) and store the result on the SD card. A pure Java MediaRecorder writes an mp4 file at 1280x720 @ 30 FPS. Showing a live preview on screen is not needed.
OpenCV-demo1 in native mode gives 1920x1080 at 2 FPS (the same in Java mode). A simple Java approach with an empty PreviewCallback tops out at 15 FPS.
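For reference, a minimal sketch of the MediaRecorder baseline mentioned above, assuming camera 0 supports 1280x720 at 30 FPS; the helper name, the bitrate and the caller-supplied output path and preview surface are illustrative, not taken from the original setup:

    import java.io.IOException;

    import android.hardware.Camera;
    import android.media.MediaRecorder;
    import android.view.Surface;

    public class RecorderBaseline {

        // Records 1280x720 @ 30 FPS H.264 into an mp4 via the hardware encoder
        // behind MediaRecorder; previewSurface and outputPath come from the caller.
        public static MediaRecorder start720p30(Camera camera, Surface previewSurface,
                                                String outputPath) throws IOException {
            camera.unlock();                              // hand the camera over to MediaRecorder

            MediaRecorder recorder = new MediaRecorder();
            recorder.setCamera(camera);
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setVideoSize(1280, 720);
            recorder.setVideoFrameRate(30);
            recorder.setVideoEncodingBitRate(6000000);    // example bitrate
            recorder.setOutputFile(outputPath);           // e.g. a file on the SD card
            recorder.setPreviewDisplay(previewSurface);   // some preview surface is still required
            recorder.prepare();
            recorder.start();
            return recorder;
        }
    }

Stopping is the usual recorder.stop(); recorder.release(); camera.lock(); sequence.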
Thank you in advance.
In terms of performance there is no gain in going for the native camera. Using Camera.setPreviewCallbackWithBuffer() in Java (off the UI thread) gives as many frames per second as any native alternative. But on some SoCs, e.g. Samsung's, the camera output may be wired directly (zero-copy) to the hardware H.264 encoder, which naturally gives excellent throughput. This is what the "pure java MediaRecorder" does under the hood. You cannot achieve the same if any manipulation of the buffer is involved.
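A minimal sketch of that callback-with-buffers setup, assuming the default NV21 preview format and a device that accepts a 1280x720 preview size; the CameraGrabber name and the buffer count are mine:

    import android.graphics.ImageFormat;
    import android.hardware.Camera;
    import android.os.Handler;
    import android.os.HandlerThread;

    public class CameraGrabber {

        // Camera callbacks are delivered to the event loop of the thread that called
        // Camera.open(), so open it from a HandlerThread to keep onPreviewFrame()
        // off the UI thread.
        private final HandlerThread cameraThread = new HandlerThread("camera");
        private Camera camera;

        public void start() {
            cameraThread.start();
            new Handler(cameraThread.getLooper()).post(new Runnable() {
                @Override
                public void run() {
                    camera = Camera.open(0);

                    Camera.Parameters params = camera.getParameters();
                    params.setPreviewSize(1280, 720);
                    params.setPreviewFormat(ImageFormat.NV21);  // NV21 is the default preview format
                    camera.setParameters(params);

                    // One NV21 frame is width * height * 3/2 bytes; queue several buffers
                    // so the driver is never starved while a frame is being processed.
                    int frameSize = 1280 * 720 * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8;
                    for (int i = 0; i < 3; i++) {
                        camera.addCallbackBuffer(new byte[frameSize]);
                    }

                    camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
                        @Override
                        public void onPreviewFrame(byte[] data, Camera cam) {
                            // modify / encode 'data' here, then hand the buffer back
                            cam.addCallbackBuffer(data);
                        }
                    });

                    // A preview target (setPreviewDisplay or setPreviewTexture) is normally
                    // still required before startPreview() delivers frames, even if nothing
                    // is shown on screen.
                    camera.startPreview();
                }
            });
        }
    }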
To close the topic: I was able to reach 1280x720 at 30 FPS using native access to the camera and the hardware H.264 encoder. I can also modify the data on the fly (watermarking) while keeping the FPS high. None of the other approaches, plain Java or OpenCV, could give more than 15 FPS (maybe I did not try hard enough).
startRecording() works perfectly.
Thank you for the comments.
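For readers who stay in Java: the hardware H.264 encoder referred to here is also exposed as MediaCodec from API 16 onward (this answer itself used the native path). A hedged configuration sketch for a 1280x720 @ 30 FPS "video/avc" encoder; the bitrate, keyframe interval and color format are illustrative and device-dependent:

    import java.io.IOException;

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;

    public class HwAvcEncoder {

        // Configures the platform's H.264 encoder (normally a hardware codec) for
        // 1280x720 @ 30 FPS. Camera frames are queued into the codec's input buffers
        // and encoded NAL units are drained from its output buffers.
        public static MediaCodec create720p30Encoder() throws IOException {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 6000000);   // example bitrate
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // one keyframe per second
            // Many encoders want semi-planar YUV; check the codec capabilities on the
            // target device, as some require COLOR_FormatYUV420Planar instead.
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);

            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            encoder.start();
            return encoder;
        }
    }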