I'm trying to stream RTSP/RTP iPhone camera capture to a Wowza server.
Apple's API does not give direct access to the H.264-encoded frames; it only lets you write them into a '.mov' container file.
Either way, I cannot read that file's content until AVAssetWriter has finished writing, which prevents me from streaming the live camera capture.
I've tried using a named pipe to read the file's content in real time, but with no success there - AVAssetWriter will not write to an existing file.
Does anyone know how to do it?
Thanks!
Edit: As of iOS 8, the hardware encoder and decoder are exposed through the Video Toolbox APIs.
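For reference, here is a minimal Swift sketch of that Video Toolbox encoding path. It assumes frames arrive as CVPixelBuffers from a capture pipeline; the `encode` helper and the resolution are illustrative, not part of Apple's API:

```swift
import VideoToolbox
import CoreMedia

var session: VTCompressionSession?

// Called by Video Toolbox with each compressed frame; the sample buffer
// holds H.264 NAL units you can packetize for RTP and push to the server.
let outputCallback: VTCompressionOutputCallback = { _, _, status, _, sampleBuffer in
    guard status == noErr, let sampleBuffer = sampleBuffer else { return }
    // Hand sampleBuffer to your RTP packetizer/streamer here.
}

VTCompressionSessionCreate(allocator: kCFAllocatorDefault,
                           width: 1280,
                           height: 720,
                           codecType: kCMVideoCodecType_H264,
                           encoderSpecification: nil,
                           imageBufferAttributes: nil,
                           compressedDataAllocator: nil,
                           outputCallback: outputCallback,
                           refcon: nil,
                           compressionSessionOut: &session)

// Hypothetical helper: feed one captured frame into the encoder.
func encode(pixelBuffer: CVPixelBuffer, presentationTime: CMTime) {
    guard let session = session else { return }
    VTCompressionSessionEncodeFrame(session,
                                    imageBuffer: pixelBuffer,
                                    presentationTimeStamp: presentationTime,
                                    duration: .invalid,
                                    frameProperties: nil,
                                    sourceFrameRefcon: nil,
                                    infoFlagsOut: nil)
}
```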
You can use an AVCaptureVideoDataOutput to process/stream each frame while the camera is running, and an AVAssetWriter to write the video file at the same time (appending each frame from the video data output's queue).
See also Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput and Can use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?
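Here's a minimal Swift sketch of that setup, assuming a controller class that owns the capture session; the `CaptureController` name and the empty delegate body are placeholders for your own code:

```swift
import AVFoundation

final class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "video.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        // Deliver every captured frame to the delegate on a serial queue.
        videoOutput.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }

        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Append sampleBuffer to an AVAssetWriterInput here, and/or
        // hand it to your encoder/streamer at the same time.
    }
}
```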
The only solution I've found working so far is capturing without sound; then the file is written to the location you've defined.
Otherwise it's probably written to a temp location you can't reach.
Here is Apple's example for capturing video: AVCam
You'll need to remove the sound channels, as sketched below.
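A sketch of the relevant change, assuming you build the session yourself rather than editing AVCam directly - the key point is simply never adding the microphone input:

```swift
import AVFoundation

let session = AVCaptureSession()

// Add the camera input only.
if let camera = AVCaptureDevice.default(for: .video),
   let videoInput = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(videoInput) {
    session.addInput(videoInput)
}

// Deliberately skip the AVCaptureDeviceInput for the microphone,
// so the recorded movie file ends up video-only.
```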
If anyone has a better way, please publish it here.