 

How to receive RTP packets streamed from an RTP URL (e.g. rtp://@225.0.0.0) on an iOS device?

I am trying to receive RTP packets (carrying audio) from an RTP URL such as rtp://@225.0.0.0. After a lot of research, I have managed to play the stream on my device using https://github.com/maknapp/vlckitSwiftSample. However, this only plays the streamed data; it has no facility to store it.

From my research and other sources, I did not find much clear, simple information on how to receive packets over RTP and store them on an iOS device.

I have tried the following projects:

  1. https://github.com/kewlbear/FFmpeg-iOS-build-script

  2. https://github.com/chrisballinger/FFmpeg-iOS

Neither of these even compiles, due to CocoaPods issues, and other projects and guides only cover RTSP streams rather than plain RTP.

If anyone can offer guidance or an idea of how to implement this, it would be appreciated.

asked Jan 19 '17 by User 1531343

1 Answer

First and foremost, you need to understand how this works.

The sender, i.e. the creator of the RTP stream, is probably doing the following:

  1. Uses a source for the data: for audio, this could be a microphone, raw audio samples, or a file
  2. Encodes the audio using an audio codec such as AAC or Opus
  3. Uses an RTP packetizer to create RTP packets from the encoded audio frames
  4. Uses a transport layer such as UDP to send these packets
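The sender's packetize-and-send steps can be sketched as follows. This is an illustrative Python sketch, not iOS code: the function names and the payload type 97 are assumptions, and the 12-byte header follows the basic RTP layout from RFC 3550 (version 2, no CSRCs, no extension).

```python
import socket
import struct

def build_rtp_packet(seq, timestamp, ssrc, payload, payload_type=97, marker=0):
    """Prepend a minimal 12-byte RTP header (RFC 3550) to an encoded frame.

    version=2, P=0, X=0, CC=0; `payload_type` 97 is a common dynamic value.
    """
    byte0 = 2 << 6                        # version 2, no padding/extension/CSRCs
    byte1 = (marker << 7) | payload_type  # M bit + payload type
    header = struct.pack("!BBHII", byte0, byte1, seq & 0xFFFF,
                         timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)
    return header + payload

def send_frames(frames, addr=("225.0.0.1", 5004), ssrc=0x12345678):
    """Hypothetical sender loop: one encoded frame per RTP packet over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    ts = 0
    for seq, frame in enumerate(frames):
        sock.sendto(build_rtp_packet(seq, ts, ssrc, frame), addr)
        ts += 1024  # e.g. samples per AAC frame at the stream's clock rate
    sock.close()
```

The timestamp increment per packet depends on the codec's frame size and clock rate, which the real sender takes from its encoder configuration.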

Protocols such as RTSP provide the necessary signaling to describe the stream. RTP alone usually isn't enough, as things such as congestion control, feedback, and dynamic bit rate are handled with the help of RTCP.

Anyway, in order to store the incoming stream, you need to do the following:

  1. Use an RTP depacketizer to extract the encoded audio frames from the packets. You can write your own or use a third-party implementation. FFmpeg is a big framework that contains the necessary code for most codecs and protocols, but for your case a simple RTP depacketizer is enough. Each codec has its own RTP payload format, so make sure you refer to the correct RFC.

  2. Once you have access to the encoded frames, write them into a media container such as M4A or Ogg, depending on the audio codec used in the stream.
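The depacketizing step above can be sketched like this. Again a Python sketch for illustration (the function name is made up); it strips the base RTP header per RFC 3550 but does not parse any codec-specific payload header, which you would still need for e.g. AAC (RFC 3640).

```python
import struct

def depacketize_rtp(packet):
    """Strip a basic RTP header (RFC 3550); return (seq, timestamp, payload).

    Handles CSRC entries, one header extension, and padding.
    Codec-specific payload headers are NOT parsed here.
    """
    if len(packet) < 12:
        raise ValueError("packet too short for an RTP header")
    byte0, byte1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    if byte0 >> 6 != 2:
        raise ValueError("not an RTP version 2 packet")
    offset = 12 + 4 * (byte0 & 0x0F)      # skip CSRC identifiers, if any
    if byte0 & 0x10:                      # X bit: skip the header extension
        ext_id, ext_len = struct.unpack("!HH", packet[offset:offset + 4])
        offset += 4 + 4 * ext_len
    payload = packet[offset:]
    if byte0 & 0x20:                      # P bit: last byte = padding length
        payload = payload[:-payload[-1]]
    return seq, ts, payload
```

The returned payload is the encoded frame (possibly after codec-specific unwrapping), which is what you would append to your output container.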

In order to play the stream, you need to do the following:

  1. Use an RTP depacketizer to extract the encoded audio frames, exactly as described above for storing the stream.

  2. Once you have access to the encoded frames, use an audio decoder (available as a library) to decode them, or check whether your platform supports that codec directly for playback.

  3. Once you have access to the decoded frames, on iOS you can use AVFoundation to play them.
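One practical detail between depacketizing and decoding: UDP can deliver packets out of order, so live players typically hold packets briefly in a jitter buffer and release them in sequence-number order. A minimal sketch of that idea (the class and its depth parameter are illustrative; a real buffer also handles 16-bit sequence wrap-around and late or lost packets):

```python
import heapq

class JitterBuffer:
    """Hold packets briefly and release them in sequence-number order."""

    def __init__(self, depth=4):
        self.depth = depth  # how many packets to keep buffered
        self.heap = []      # min-heap keyed by sequence number

    def push(self, seq, frame):
        heapq.heappush(self.heap, (seq, frame))

    def pop_ready(self):
        """Release the oldest frames once more than `depth` are buffered."""
        out = []
        while len(self.heap) > self.depth:
            out.append(heapq.heappop(self.heap))
        return out
```

Frames released this way are then fed to the decoder in order; the buffer depth trades latency against tolerance for network reordering.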

If you are looking for an easy way to do this, consider a third-party framework such as http://audiokit.io/

answered Sep 22 '22 by manishg