RTP iPhone camera - How to read an AVAssetWriter file while it's being written?

I'm trying to stream RTSP/RTP iPhone camera capture to a Wowza server.

Apple's API does not give direct access to the H.264-encoded frames; it only lets you write them into a '.mov' container file.

Either way, I cannot access that file's contents until AVAssetWriter has finished writing, which rules out streaming the live camera capture.

I've tried using a named pipe to read the file's contents in real time, but with no success - AVAssetWriter will not write to an existing file.

Does anyone know how to do it?

Thanks!

Edit: Starting with iOS 8, there are public APIs for the hardware encoder and decoder (VideoToolbox).
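For illustration, here is a hedged Swift sketch of creating such a hardware H.264 encoder session with VideoToolbox (the 1280×720 dimensions are placeholders and error handling is omitted; the original question predates Swift, but the same C API is callable from Objective-C):

```swift
import VideoToolbox

// Sketch only: create an H.264 compression session backed by the
// hardware encoder (available to apps since iOS 8).
var session: VTCompressionSession?
VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1280, height: 720,               // placeholder dimensions
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: { _, _, status, _, sampleBuffer in
        // Each encoded H.264 sample buffer arrives here, ready to be
        // packetized (e.g. into RTP) and sent to the server.
    },
    refcon: nil,
    compressionSessionOut: &session)
```

With this, the '.mov' container can be bypassed entirely: you feed camera pixel buffers to the session and packetize the encoded output yourself.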

asked Jul 24 '12 by Avishay Cohen

2 Answers

You can use an AVCaptureVideoDataOutput to process/stream each frame while the camera is running, and an AVAssetWriter to write the video file at the same time (appending each frame from the video data output's queue).

See also Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput and Can use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?
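The two-output approach above can be sketched in Swift (the question era was Objective-C, but the same AVFoundation classes apply; `FrameTap` and the 1280×720 settings are illustrative, not from the answer):

```swift
import AVFoundation

// Sketch: receive every camera frame via AVCaptureVideoDataOutput,
// stream it yourself, and also append it to an AVAssetWriter file.
final class FrameTap: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    var writer: AVAssetWriter?
    var writerInput: AVAssetWriterInput?

    func configure() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let cameraInput = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(cameraInput) { session.addInput(cameraInput) }

        // Frames are delivered to captureOutput(_:didOutput:from:) below.
        videoOutput.setSampleBufferDelegate(self,
                                            queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }

        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("capture.mov")
        writer = try AVAssetWriter(outputURL: url, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,   // placeholder dimensions
            AVVideoHeightKey: 720,
        ])
        input.expectsMediaDataInRealTime = true
        writer?.add(input)
        writerInput = input
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // 1. Hand the raw frame to your streaming pipeline here.
        // 2. Also append it to the file writer.
        if writer?.status == .unknown {
            writer?.startWriting()
            writer?.startSession(
                atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        }
        if writerInput?.isReadyForMoreMediaData == true {
            writerInput?.append(sampleBuffer)
        }
    }
}
```

The key point is that the data output and the asset writer are independent consumers: the file never needs to be read back, because you already hold each frame before it is written.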

answered Sep 28 '22 by Eduard Feicho

The only solution I've found working so far is capturing without sound; then the file is written to the location you've defined. Otherwise it's apparently written to a temporary location you can't reach.

Here is Apple's example for capturing video: AVCam.
You'll need to remove the sound channels.
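A minimal sketch of the audio-free capture setup this answer describes (Swift; `makeVideoOnlySession` is a made-up name, and Apple's AVCam sample is more elaborate):

```swift
import AVFoundation

// Following the AVCam pattern, but adding only the video device input,
// never an audio input, so the recorded movie contains no sound track.
func makeVideoOnlySession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.beginConfiguration()

    if let camera = AVCaptureDevice.default(for: .video) {
        let videoInput = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(videoInput) { session.addInput(videoInput) }
    }
    // Deliberately skip AVCaptureDevice.default(for: .audio) here.

    let movieOutput = AVCaptureMovieFileOutput()
    if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }

    session.commitConfiguration()
    return session
}
```

With no audio input attached, the movie file output records video only, which this answer reports is what makes the file appear at the URL you specified.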

If anyone has a better way, please publish it here.

answered Sep 28 '22 by Avishay Cohen