Here is a complete project if you care to run this yourself: https://www.dropbox.com/s/5p384mogjzflvqk/AVPlayerLayerSoundOnlyBug_iOS10.zip?dl=0
This is a new problem on iOS 10, and it has been fixed as of iOS 10.2. After exporting a video using AVAssetExportSession and AVVideoCompositionCoreAnimationTool to composite a layer on top of the video during export, videos played in an AVPlayerLayer fail to render: the audio plays but the video never appears. This doesn't seem to be caused by hitting the AV encode/decode pipeline limit, because it often happens after a single export, which as far as I know only spins up two pipelines: one for the AVAssetExportSession and another for the AVPlayer. I am also setting the layer's frame properly, as you can see by running the code below, which gives the layer a blue background you can plainly see.
After an export, waiting for some time before playing a video seems to make playback far more reliable, but that's not an acceptable workaround to offer users.
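To be explicit about what "waiting" means here, this is a minimal sketch of the delay workaround; the startPlayback() helper (standing in for the AVPlayerLayer code in the sample below) and the 3-second delay are assumptions for illustration:

// Hypothetical workaround sketch: delay playback after the export finishes.
// startPlayback() and the 3-second deadline are placeholders, not real fixes.
DispatchQueue.main.asyncAfter(deadline: .now() + 3.0) {
    self.startPlayback()
}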
Any ideas on what's causing this or how I can fix or work around it? Have I messed something up, or am I missing an important step or detail? Any help or pointers to documentation are much appreciated.
import UIKit
import AVFoundation
/* After exporting an AVAsset using AVAssetExportSession with AVVideoCompositionCoreAnimationTool, we
* will attempt to play a video using an AVPlayerLayer with a blue background.
*
* If you see the blue background and hear audio you're experiencing the missing-video bug. Otherwise
* try hitting the button again.
*/
class ViewController: UIViewController {

    private var playerLayer: AVPlayerLayer?
    private let button = UIButton()
    private let indicator = UIActivityIndicatorView(activityIndicatorStyle: .gray)

    override func viewDidLoad() {
        super.viewDidLoad()

        view.backgroundColor = UIColor.white

        button.setTitle("Cause Trouble", for: .normal)
        button.setTitleColor(UIColor.black, for: .normal)
        button.addTarget(self, action: #selector(ViewController.buttonTapped), for: .touchUpInside)
        view.addSubview(button)
        button.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            button.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -16),
        ])

        indicator.hidesWhenStopped = true
        view.insertSubview(indicator, belowSubview: button)
        indicator.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            indicator.centerXAnchor.constraint(equalTo: button.centerXAnchor),
            indicator.centerYAnchor.constraint(equalTo: button.centerYAnchor),
        ])
    }

    func buttonTapped() {
        button.isHidden = true
        indicator.startAnimating()
        playerLayer?.removeFromSuperlayer()

        let sourcePath = Bundle.main.path(forResource: "video.mov", ofType: nil)!
        let sourceURL = URL(fileURLWithPath: sourcePath)
        let sourceAsset = AVURLAsset(url: sourceURL)

        //////////////////////////////////////////////////////////////////////
        // STEP 1: Export a video using AVVideoCompositionCoreAnimationTool //
        //////////////////////////////////////////////////////////////////////
        let exportSession = { () -> AVAssetExportSession in
            let sourceTrack = sourceAsset.tracks(withMediaType: AVMediaTypeVideo).first!

            let parentLayer = CALayer()
            parentLayer.frame = CGRect(origin: .zero, size: CGSize(width: 1280, height: 720))
            let videoLayer = CALayer()
            videoLayer.frame = parentLayer.bounds
            parentLayer.addSublayer(videoLayer)

            let composition = AVMutableVideoComposition(propertiesOf: sourceAsset)
            composition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
            let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: sourceTrack)
            layerInstruction.setTransform(sourceTrack.preferredTransform, at: kCMTimeZero)
            let instruction = AVMutableVideoCompositionInstruction()
            instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: sourceAsset.duration)
            instruction.layerInstructions = [layerInstruction]
            composition.instructions = [instruction]

            let e = AVAssetExportSession(asset: sourceAsset, presetName: AVAssetExportPreset1280x720)!
            e.videoComposition = composition
            e.outputFileType = AVFileTypeQuickTimeMovie
            e.timeRange = CMTimeRange(start: kCMTimeZero, duration: sourceAsset.duration)
            let outputURL = URL(fileURLWithPath: NSTemporaryDirectory().appending("/out2.mov"))
            _ = try? FileManager.default.removeItem(at: outputURL)
            e.outputURL = outputURL
            return e
        }()

        print("Exporting asset...")
        exportSession.exportAsynchronously {
            assert(exportSession.status == .completed)

            //////////////////////////////////////////////
            // STEP 2: Play a video in an AVPlayerLayer //
            //////////////////////////////////////////////
            DispatchQueue.main.async {
                // Reuse player layer, shouldn't be hitting the AV pipeline limit
                let playerItem = AVPlayerItem(asset: sourceAsset)
                let layer = self.playerLayer ?? AVPlayerLayer()
                if layer.player == nil {
                    layer.player = AVPlayer(playerItem: playerItem)
                }
                else {
                    layer.player?.replaceCurrentItem(with: playerItem)
                }
                layer.backgroundColor = UIColor.blue.cgColor
                if UIDeviceOrientationIsPortrait(UIDevice.current.orientation) {
                    layer.frame = self.view.bounds
                    layer.bounds.size.height = layer.bounds.width * 9.0 / 16.0
                }
                else {
                    layer.frame = self.view.bounds.insetBy(dx: 0, dy: 60)
                    layer.bounds.size.width = layer.bounds.height * 16.0 / 9.0
                }
                self.view.layer.insertSublayer(layer, at: 0)
                self.playerLayer = layer
                layer.player?.play()
                print("Playing a video in an AVPlayerLayer...")

                self.button.isHidden = false
                self.indicator.stopAnimating()
            }
        }
    }
}
The answer for me in this case is to work around the issue with AVVideoCompositionCoreAnimationTool by using a custom video compositing class implementing the AVVideoCompositing protocol, and a custom composition instruction implementing the AVVideoCompositionInstruction protocol. Because I need to overlay a CALayer on top of the video, I'm including that layer in the composition instruction instance.
You need to set the custom compositor on your video composition like so:
composition.customVideoCompositorClass = CustomVideoCompositor.self
and then set your custom instructions on it:
let instruction = CustomVideoCompositionInstruction(...) // whatever parameters you need and are required by the instruction protocol
composition.instructions = [instruction]
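For concreteness, here's a minimal sketch of what those two types can look like. The names CustomVideoCompositor and CustomVideoCompositionInstruction, and the idea of carrying a prerendered CIImage overlay on the instruction, are assumptions for illustration rather than the exact code this answer describes:

import AVFoundation
import CoreImage

// Sketch of a custom instruction: it satisfies AVVideoCompositionInstructionProtocol
// and carries the overlay for the compositor to draw on each frame.
class CustomVideoCompositionInstruction: NSObject, AVVideoCompositionInstructionProtocol {
    let timeRange: CMTimeRange
    let enablePostProcessing = false
    let containsTweening = true
    let requiredSourceTrackIDs: [NSValue]?
    let passthroughTrackID = kCMPersistentTrackID_Invalid

    // The overlay to composite on top of each frame (rendered once and cached; see below).
    let overlayImage: CIImage?

    init(timeRange: CMTimeRange, trackID: CMPersistentTrackID, overlayImage: CIImage?) {
        self.timeRange = timeRange
        self.requiredSourceTrackIDs = [NSNumber(value: trackID)]
        self.overlayImage = overlayImage
        super.init()
    }
}

// Sketch of a custom compositor; the BGRA pixel format is an assumption.
class CustomVideoCompositor: NSObject, AVVideoCompositing {
    let sourcePixelBufferAttributes: [String : Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    let requiredPixelBufferAttributesForRenderContext: [String : Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        // Nothing needed for this sketch; each request carries its render context.
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // Per-frame compositing goes here; see the CoreImage sketch further down.
    }

    func cancelAllPendingVideoCompositionRequests() {
    }
}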
EDIT: Here is a working example of how to use a custom compositor to overlay a layer on a video using the GPU: https://github.com/samsonjs/LayerVideoCompositor ... original answer continues below
As for the compositor itself, you can implement one if you watch the relevant WWDC sessions and check out their sample code. I cannot post the one I wrote here, but I am using CoreImage to do the heavy lifting when processing each AVAsynchronousVideoCompositionRequest, making sure to use an OpenGL CoreImage context for best performance (if you do it on the CPU it will be abysmally slow). You may also need an autorelease pool if you get a memory usage spike during the export.
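As an illustration of that GPU path, here's a sketch of the per-frame work, assuming the hypothetical CustomVideoCompositor and CustomVideoCompositionInstruction sketched above; the CISourceOverCompositing filter and the error handling are my choices, not necessarily this answer's:

// Inside the hypothetical CustomVideoCompositor. The file also needs
// import OpenGLES (for EAGLContext) alongside AVFoundation and CoreImage.
private let ciContext: CIContext = {
    // Prefer a GPU-backed CoreImage context; fall back to the much slower CPU path.
    if let eaglContext = EAGLContext(api: .openGLES2) {
        return CIContext(eaglContext: eaglContext)
    }
    return CIContext()
}()

func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
    // The autorelease pool keeps pixel-buffer memory from piling up during export.
    autoreleasepool {
        guard let instruction = request.videoCompositionInstruction as? CustomVideoCompositionInstruction,
            let trackID = (instruction.requiredSourceTrackIDs?.first as? NSNumber)?.int32Value,
            let sourceBuffer = request.sourceFrame(byTrackID: trackID),
            let outputBuffer = request.renderContext.newPixelBuffer() else {
                request.finish(with: NSError(domain: "CustomVideoCompositor", code: -1, userInfo: nil))
                return
        }

        // Composite the cached overlay over the source frame, entirely in CoreImage.
        var frame = CIImage(cvPixelBuffer: sourceBuffer)
        if let overlay = instruction.overlayImage,
            let filter = CIFilter(name: "CISourceOverCompositing") {
            filter.setValue(overlay, forKey: kCIInputImageKey)
            filter.setValue(frame, forKey: kCIInputBackgroundImageKey)
            frame = filter.outputImage ?? frame
        }

        ciContext.render(frame, to: outputBuffer)
        request.finish(withComposedVideoFrame: outputBuffer)
    }
}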
If you're overlaying a CALayer like me, then make sure to set layer.isGeometryFlipped = true when you render that layer out to a CGImage before sending it off to CoreImage. And make sure you cache the rendered CGImage from frame to frame in your compositor.
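To make that concrete, here's one way to render the overlay layer once and cache the result; the renderedOverlay(from:) helper and cachedOverlay property are hypothetical, and the sketch assumes the layer is already sized to the video's pixel dimensions:

// Renders a CALayer to a CIImage once and caches it; re-rendering on every
// frame would be wasted work since the overlay content doesn't change.
// UIKit is needed for the UIGraphics image-context functions.
private var cachedOverlay: CIImage?

func renderedOverlay(from overlayLayer: CALayer) -> CIImage? {
    if let cached = cachedOverlay { return cached }

    // CoreImage's coordinate space is flipped relative to CALayer's.
    overlayLayer.isGeometryFlipped = true

    UIGraphicsBeginImageContextWithOptions(overlayLayer.bounds.size, false, 1)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    overlayLayer.render(in: context)
    guard let cgImage = UIGraphicsGetImageFromCurrentImageContext()?.cgImage else { return nil }

    let overlay = CIImage(cgImage: cgImage)
    cachedOverlay = overlay
    return overlay
}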
We had the same issue on iOS 10 and 10.1. It looks fixed as of iOS 10.2 beta 3, though.
To expand upon Sami Samhuri's answer, here's a small sample project I worked up that uses a custom AVVideoCompositing class with custom instructions implementing AVVideoCompositionInstructionProtocol: https://github.com/claygarrett/CustomVideoCompositor
The project allows you to place a watermark over a video, but the idea could be extended to whatever you need. It prevents the AVPlayer bug in question from surfacing.
Another interesting solution on a separate thread that might help: AVPlayer playback fails while AVAssetExportSession is active as of iOS 10