I am trying to stream audio to iOS and Safari clients using Apple's HTTP Live Streaming protocol. Unlike many common implementations of HTTP Live Streaming, my goal is to use short audio clips that by nature are of varying lengths, mostly in the 10-30 second range. In addition to streaming the audio from these segments, I would like to access metadata for each segment so I can update the display and/or give the user additional options to get more information about a particular audio segment.
Currently I've set up a few test cases that convert my source audio (MP3) to various formats and create streaming M3U files to test on iOS devices, but none of my approaches have worked properly (streaming correctly and passing metadata to the client). I am using AVPlayer to load and play the created M3U files:
_playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"http://localhost/sample.m3u8"]];
_player = [[AVPlayer alloc] initWithPlayerItem:_playerItem];
[_playerItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:NULL];
// ... wait for user input
[_player play];
Approach 1: Raw MP3 files
I took my original source MP3 files with id3v2 (v2.3.0) metadata and added them to an M3U playlist.
#EXTM3U
#EXT-X-TARGETDURATION:23
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:14
http://localhost/trk_01.mp3
#EXTINF:22
http://localhost/trk_02.mp3
#EXTINF:16
http://localhost/trk_03.mp3
#EXT-X-ENDLIST
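A playlist like the one above can also be generated programmatically. Here is a minimal Python sketch (the helper name and the choice of Python are mine, not part of the original setup) that derives EXT-X-TARGETDURATION from the actual segment durations instead of hard-coding it; per the HLS spec, the target duration must be at least as long as the longest segment:

```python
import math

def make_media_playlist(segments):
    """Build a simple VOD-style HLS media playlist.

    `segments` is a list of (url, duration_seconds) tuples.
    EXT-X-TARGETDURATION must be >= the longest segment duration,
    so it is computed from the inputs rather than hard-coded.
    """
    target = math.ceil(max(duration for _, duration in segments))
    lines = [
        "#EXTM3U",
        f"#EXT-X-TARGETDURATION:{target}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for url, duration in segments:
        lines.append(f"#EXTINF:{duration}")
        lines.append(url)
    lines.append("#EXT-X-ENDLIST")
    return "\n".join(lines) + "\n"

playlist = make_media_playlist([
    ("http://localhost/trk_01.mp3", 14),
    ("http://localhost/trk_02.mp3", 22),
    ("http://localhost/trk_03.mp3", 16),
])
print(playlist)
```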
Results: The timedMetadata property is updated as soon as playback starts with the correct ID3 information for the first track. The first track plays, but cuts off near the end. ID3 data for the second track shows up, but the second track does not start playing. After a few moments I get an error to the console:
2011-04-26 07:04:52.668 TestClient[49756:601b] Prime: Exiting because mConverterError is '!buf' (0x800 req, 0x0 primed)
2011-04-26 07:04:52.668 TestClient[49756:601b] Prime failed ('!buf'); will stop (2048/0 frames)
Approach 2: Use Apple's mediafilesegmenter to create individual MP3 files
In this approach I use mediafilesegmenter to create a new MP3 file for each segment. Apple's segmenting tool is normally used for, well, segmenting, but because my audio clips are all short and of varying lengths, this doesn't really fit my application. I pass a target duration of 999 seconds to the utility so that it creates a single output file for each input file I give it. Here is the command I use to create each individual track:
mediafilesegmenter -t 999 -f "$OUTPUT_DIR" "$INPUT_FILE" && cp "$OUTPUT_DIR/fileSequence0.mp3" "$OUTPUT_FILE"
The resulting MP3 file seems to have some timestamp data, as vbindiff shows me a change in the file header and the string "com.apple.streaming.transportStreamTimestamp" shows up in the first few bytes of the new file. Researching that string brings up a passage in the HTTP Live Streaming draft specification:
Elementary Audio Stream files MUST signal the timestamp of the first sample in the file by prepending an ID3 PRIV tag [ID3] with an owner identifier of "com.apple.streaming.transportStreamTimestamp". The binary data MUST be a 33-bit MPEG-2 Program Elementary Stream timestamp expressed as a big-endian eight-octet number, with the upper 31 bits set to zero.
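For reference, the PRIV payload described in that passage is straightforward to decode. The sketch below (Python; the function name is my own, not from the spec) extracts the 33-bit timestamp from the eight-octet big-endian payload and converts it from the 90 kHz MPEG-2 clock to seconds:

```python
import struct

def decode_transport_stream_timestamp(priv_data: bytes) -> float:
    """Decode the 8-byte payload of a
    com.apple.streaming.transportStreamTimestamp ID3 PRIV tag.

    The payload is a big-endian 64-bit integer whose low 33 bits
    hold an MPEG-2 PES timestamp in 90 kHz clock ticks; the upper
    31 bits are required to be zero.
    """
    (raw,) = struct.unpack(">Q", priv_data)
    pts = raw & ((1 << 33) - 1)  # keep only the 33-bit timestamp
    return pts / 90000.0         # convert 90 kHz ticks to seconds

# Example: a segment whose first sample starts at 10 seconds
payload = struct.pack(">Q", 10 * 90000)
print(decode_transport_stream_timestamp(payload))  # 10.0
```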
I then create an M3U file just as in Approach 1. (Note that using mediafilesegmenter I can also pass ID3 info using pre-created ID3 tag files and a meta-file describing ID3 time offsets. I'm skipping that here because I can't even get these files to play back correctly.)
Results: The first track is streamed just as in Approach 1. The track again cuts off near the end, and the second track does not play. No metadata is present, but this can be added easily enough using mediafilesegmenter's -M option.
Approach 3: Use ffmpeg to create MPEG Transport Stream files
Using this final approach, I pass my source MP3 files through ffmpeg to create MPEG Transport Stream data:
ffmpeg -i "$INPUT_FILE" -f mpegts -acodec copy "$OUTPUT_FILE"
I then create an M3U just as in the first two approaches.
Results: This approach actually works; all of the files stream smoothly on the client. I'm unable to pass any metadata through to the client, however. I've tried passing arguments like -metadata title="My Title" to ffmpeg with no luck.
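The conversion step in this approach can be scripted across a whole directory of clips. The sketch below (Python; the helper names are mine, not part of the original workflow) simply rebuilds the same ffmpeg invocation shown above for each MP3 file, remuxing the audio into an MPEG transport stream without re-encoding:

```python
import pathlib
import subprocess

def ts_command(input_file: str, output_file: str) -> list:
    """Build the ffmpeg command used above: wrap the MP3 audio
    in an MPEG transport stream container without re-encoding."""
    return ["ffmpeg", "-i", input_file,
            "-f", "mpegts", "-acodec", "copy", output_file]

def convert_all(src_dir: str, dst_dir: str) -> None:
    """Remux every .mp3 in src_dir to a .ts file in dst_dir.
    Assumes ffmpeg is on the PATH."""
    for mp3 in sorted(pathlib.Path(src_dir).glob("*.mp3")):
        out = pathlib.Path(dst_dir) / (mp3.stem + ".ts")
        subprocess.run(ts_command(str(mp3), str(out)), check=True)
```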
M3U (or M3U8) is a plain text file format originally created to organize collections of MP3 files. The format is extended for HLS, where it's used to define media streams.
An index file, or playlist, provides an ordered list of the URLs of the media segment files. Index files for HTTP Live Streaming are saved as M3U8 playlists, an extension of the M3U format used for MP3 playlists. The URL of the index file is accessed by clients, which then request the indexed files in sequence.
HLS (HTTP Live Streaming) is an HTTP-based adaptive bitrate streaming protocol introduced by Apple in 2009 for iOS, macOS, and Apple TV devices; it has since become one of the most widely used protocols for streaming video and audio over the internet.
Just a suggestion: have you tried this project? https://github.com/DigitalDJ/AudioStreamer I'm using it in my own project and it works well.
Update 1:
You can copy metadata from one file to another using ffmpeg's -map_metadata option (older ffmpeg builds spelled it -map_meta_data). For example, placing -map_metadata 0 before the output file copies the global metadata from the first input:
ffmpeg -i /root/Desktop/new_tracks/02-drug-raps.mp3 -ab 24k -map_metadata 0 /root/Desktop/new_tracks/converted/2.mp3