I am trying to get the video resolution when playing an HLS stream. I have a typical player init:
let urlAsset = AVURLAsset(URL: currentVideoUrl)
self.player = AVPlayer(playerItem: AVPlayerItem(asset: urlAsset))
.......
I use KVO, and I try to get the video size when the AVPlayerItem reaches the .ReadyToPlay status:
func resolutionSizeForVideo() {
    guard let videoTrack = self.player.currentItem?.asset.tracksWithMediaType(AVMediaTypeVideo).first else {
        return
    }
    let size = CGSizeApplyAffineTransform(videoTrack.naturalSize, videoTrack.preferredTransform)
    let frameSize = CGSize(width: fabs(size.width), height: fabs(size.height))
    print("video size: \(frameSize)")
}
The problem is that tracksWithMediaType() always returns an empty array (but it works for non-stream files, e.g. .mov).
How can I get the size (CGSize) of the HLS video playing inside an AVPlayer?
The asset's tracks array will always be empty when playing an HLS stream. If you have a UIView subclass that overrides its layerClass with AVPlayerLayer for playing the video, you can get the size from the layer's videoRect:
(playerView.layer as! AVPlayerLayer).videoRect
This is the size of just the video and not the entire layer.
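For reference, here is a minimal sketch of such a view; the PlayerView class and its property names are illustrative, not part of the original answer:
import AVFoundation
import UIKit

/// Minimal sketch of a view whose backing layer is an AVPlayerLayer.
final class PlayerView: UIView {
    override class var layerClass: AnyClass {
        return AVPlayerLayer.self
    }

    var playerLayer: AVPlayerLayer {
        return layer as! AVPlayerLayer
    }

    var player: AVPlayer? {
        get { return playerLayer.player }
        set { playerLayer.player = newValue }
    }
}

// Once the video is on screen, the rectangle actually occupied by the video is:
// let videoRect = playerView.playerLayer.videoRect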
Alternatively, you can use KVO to observe the presentationSize of the current item:
player.addObserver(self, forKeyPath: "currentItem.presentationSize", options: [.initial, .new], context: nil)
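A sketch of how that observation could be wrapped up, including the matching callback; the VideoSizeObserver class is illustrative and assumes you keep a reference to the player:
import AVFoundation

final class VideoSizeObserver: NSObject {
    private let player: AVPlayer

    init(player: AVPlayer) {
        self.player = player
        super.init()
        player.addObserver(self, forKeyPath: "currentItem.presentationSize",
                           options: [.initial, .new], context: nil)
    }

    override func observeValue(forKeyPath keyPath: String?,
                               of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        guard keyPath == "currentItem.presentationSize",
              let size = player.currentItem?.presentationSize,
              size != .zero else { return }
        // For HLS this is the resolution of the variant currently being played,
        // so it can change mid-stream when the player switches bitrate.
        print("video size: \(size)")
    }

    deinit {
        player.removeObserver(self, forKeyPath: "currentItem.presentationSize")
    }
}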
Are you able to log at least the video info using this method?
extension AVAsset {
    func videoSize() -> CGSize {
        let tracks = self.tracks(withMediaType: AVMediaType.video)
        if tracks.count > 0 {
            let videoTrack = tracks[0]
            let size = videoTrack.naturalSize
            let txf = videoTrack.preferredTransform
            let realVidSize = size.applying(txf)
            print(videoTrack)
            print(txf)
            print(size)
            print(realVidSize)
            return realVidSize
        }
        return CGSize(width: 0, height: 0)
    }
}
let videoAssetSource = AVAsset(url: videoURL)
print("size:", videoAssetSource.videoSize())