It's my first question here, so please don't be too harsh.
I'm playing video from the net using AVPlayer. I output the current frame using an AVPlayerItemVideoOutput attached to the AVPlayerItem played by the AVPlayer. To check whether a new frame is ready I call [AVPlayerItemVideoOutput hasNewPixelBufferForItemTime], then render it using OpenGL ES. Everything works perfectly when I play an mp4, but when I play an m3u8 it works for about 1 second (~30 frames); after that [AVPlayerItemVideoOutput hasNewPixelBufferForItemTime] returns only FALSE, so the current frame is never updated.
If I seek with [AVPlayer seekToTime] before the problem first occurs, everything continues to work normally.
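For context, the output is created and attached roughly like this (a simplified sketch; the real code is in the modified AVPlayerDemo linked below, and the pixel format is just an example):

// Simplified sketch of the setup described above (names are illustrative).
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                             @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:videoURL];
[playerItem addOutput:videoOutput];                 // attach the output to the item
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];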
The test m3u8 video I use lives here:
http://195.16.112.71/adaptive/3006a26a-9154-4b38-a327-4fa2a2381ae6.video/3006a26a-9154-4b38-a327-4fa2a2381ae6.m3u8
To reproduce this problem I modified Apple's AVPlayerDemo sample; here it is: https://yadi.sk/d/T2aVGoKnWmf5Z
The main change is that I call [AVPlayerDemoPlaybackViewController update], which calls the mentioned [AVPlayerItemVideoOutput hasNewPixelBufferForItemTime]. This function has a static counter variable that stores the number of successful [AVPlayerItemVideoOutput copyPixelBufferForItemTime] calls.
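The update method looks approximately like this (a simplified sketch, not the exact code from the linked archive):

// Approximation of the modified -[AVPlayerDemoPlaybackViewController update].
- (void)update
{
    static NSUInteger counter = 0;   // counts successful copyPixelBufferForItemTime: calls
    CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer =
            [self.videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
        if (pixelBuffer) {
            counter++;               // stops around 30 for the m3u8 URL
            // ... upload the buffer to an OpenGL ES texture and draw it ...
            CVBufferRelease(pixelBuffer);
        }
    }
}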
The video URL is set in [AVPlayerDemoPlaybackViewController setURL]; it's hardcoded at the beginning of the function. By default it points to the m3u8 video, which reproduces the problem: in that case the counter typically reaches about 30, and after the frame with that index [AVPlayerItemVideoOutput hasNewPixelBufferForItemTime] returns only FALSE.
When the other video URL is used (see the beginning of [AVPlayerDemoPlaybackViewController setURL]; there is an alternative URL you can uncomment), all frames are read successfully.
Any help will be appreciated!
In the -observeValueForKeyPath: method we check whether the AVPlayer and AVPlayerItem are ready for playback. Once they are, we seek to the correct starting time and set a flag that is observed by our engine. Once the flag is YES, the engine calls -play on the AVPlayer.
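A rough sketch of that check (the readyForPlayback flag and startTime property names are illustrative, not the exact names from our code):

// Seek and raise the flag once both the player and the item are ready.
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    BOOL playerReady = (self.player.status == AVPlayerStatusReadyToPlay);
    BOOL itemReady   = (self.playerItem.status == AVPlayerItemStatusReadyToPlay);
    if (playerReady && itemReady && !self.readyForPlayback) {
        [self.player seekToTime:self.startTime completionHandler:^(BOOL finished) {
            // The engine observes this flag and calls -play on the AVPlayer.
            self.readyForPlayback = YES;
        }];
    }
}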
Edit: As for workarounds, AVPlayerLayer seems to work reliably, but there is a constant stream of error messages to the console about the FBO being incomplete (which it isn't). Are you suggesting that I attach an AVPlayerLayer even though it's useless in my OpenGL-based rendering?
Make sure that AVPlayerItem.status equals AVPlayerItemStatusReadyToPlay before calling the - (void)addOutput:(AVPlayerItemOutput *)output method on AVPlayerItem.
Reference: Renaud's reply on this page.
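In other words, guard the call to addOutput: on the item's status, roughly like this (assuming _playerItem and _videoOutput already exist):

// Only attach the output once the item reports it is ready to play.
if (_playerItem.status == AVPlayerItemStatusReadyToPlay) {
    [_playerItem addOutput:_videoOutput];
}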
I got the same problem with my implementation. After trying the solutions proposed here, I think I finally found a reliable way to do things.

The AVPlayerItemVideoOutput must be created AFTER the AVPlayerItem status is ready to play. So:

1. Create the player and player item, the dispatch queue and the display link
2. Register an observer for the AVPlayerItem status key
3. On status AVPlayerStatusReadyToPlay, create the AVPlayerItemVideoOutput and start the display link (a sketch follows after this reply)

Thanks to all for the inspiration.
Renaud
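A sketch of that sequence, assuming the status observer has already been registered with addObserver:forKeyPath:options:context: and that a CADisplayLink drives the frame pulls (property names are illustrative):

// Create the video output only once the item is ready to play, then start pulling frames.
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if (object == self.playerItem && [keyPath isEqualToString:@"status"] &&
        self.playerItem.status == AVPlayerItemStatusReadyToPlay &&
        self.videoOutput == nil) {
        NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                     @(kCVPixelFormatType_32BGRA) };
        self.videoOutput = [[AVPlayerItemVideoOutput alloc]
                            initWithPixelBufferAttributes:attrs];
        [self.playerItem addOutput:self.videoOutput];
        self.displayLink.paused = NO;   // start the display link
    }
}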
The code below does not solve my problem; I still get nothing from [AVPlayerItemVideoOutput hasNewPixelBufferForItemTime]:
if (failedCount > 100) {
    failedCount = 0;
    [_playerItem removeOutput:_output];
    [_playerItem addOutput:_output];
}
Finally, after testing my code for a whole day, I found a way to solve it.
#pragma mark - AVPlayerItemOutputPullDelegate
- (void)outputMediaDataWillChange:(AVPlayerItemOutput *)sender {
    if (![self.videoOutput hasNewPixelBufferForItemTime:CMTimeMake(1, 10)]) {
        [self configVideoOutput];
    }
    [self.displayLink setPaused:NO];
}
Check [AVPlayerItemVideoOutput hasNewPixelBufferForItemTime] when outputMediaDataWillChange: is called, and recreate your AVPlayerItemVideoOutput if there is no new pixel buffer at 0.1 s. The code in [self configVideoOutput] just recreates a new AVPlayerItemVideoOutput to replace the current videoOutput property (a sketch follows below).
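The answer does not show [self configVideoOutput]; a minimal version might look like this (the pixel buffer attributes and the videoOutputQueue property are assumptions, not code from the answer):

// Drop the stuck output and attach a fresh AVPlayerItemVideoOutput in its place.
- (void)configVideoOutput
{
    if (self.videoOutput) {
        [self.playerItem removeOutput:self.videoOutput];
    }
    NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                 @(kCVPixelFormatType_32BGRA) };
    self.videoOutput = [[AVPlayerItemVideoOutput alloc]
                        initWithPixelBufferAttributes:attrs];
    [self.videoOutput setDelegate:self queue:self.videoOutputQueue];
    [self.videoOutput requestNotificationOfMediaDataChangeWithAdvanceInterval:0.1];
    [self.playerItem addOutput:self.videoOutput];
}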
Why 0.1s?
I tested and experimented many times and found that the first one or two frames may never have a pixel buffer. So at 1/30 s or 2/30 s (for video at 30 fps) there may be no frame and no pixel buffer yet. But if there is still no video pixel buffer after 0.1 s, the video output is probably broken or has some other problem, so we need to recreate it.
I noticed that AVPlayerItemVideoOutput "jams" somehow when using HLS multi-bitrate playlists. When the player switches to a higher bitrate, the track ID of the player item's video track changes; the output gets a few pixel buffers, but after that hasNewPixelBufferForItemTime always returns NO.
I have spent days on this problem. By accident I noticed that if the app goes to the background and then back to the foreground, the video plays normally at the higher bitrate. That is not a solution, though.
Finally I found a workaround for this problem. I keep a counter of failed pixel buffer requests; after 100 failures I remove the current output from the player item and add the same instance back.
if (failedCount > 100)
{
    failedCount = 0;
    [_playerItem removeOutput:_output];
    [_playerItem addOutput:_output];
}
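For context, the counter would be maintained in the per-frame pull, roughly like this (the display-link callback name is illustrative):

// Illustrative display-link callback showing where failedCount is updated.
- (void)displayLinkCallback:(CADisplayLink *)sender
{
    CMTime itemTime = [_output itemTimeForHostTime:CACurrentMediaTime()];
    if ([_output hasNewPixelBufferForItemTime:itemTime]) {
        failedCount = 0;             // reset on every successful frame
        CVPixelBufferRef pixelBuffer =
            [_output copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
        // ... render pixelBuffer ...
        CVBufferRelease(pixelBuffer);
    } else {
        failedCount++;
        // run the remove/add snippet above once failedCount exceeds 100
    }
}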