Perhaps you have noticed one of the latest trends in iOS apps: using videos as backgrounds, mainly on login or "first launch" screens. Yesterday I attempted to mimic this with a very simple test project (only one view controller), and I am pleased with the results except for the performance. When trying it out in the iOS Simulator (on a simulated iPhone 6), the CPU usage fluctuates between 70% and 110%. This seems very unreasonable for a simple login screen.
This is what it looks like in action: http://oi57.tinypic.com/nqqntv.jpg
The question is then: Is there a more CPU-efficient way to achieve this? How are apps like Vine, Spotify and Instagram doing it?
Before you answer: the method I used was a full-HD video played back using MPMoviePlayerController:
- (void)viewDidLoad {
    [super viewDidLoad];

    // Find the movie file in the app bundle
    NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"arenaVideo" ofType:@"mp4"];
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];

    // Load and configure the movie player
    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    self.moviePlayer.controlStyle = MPMovieControlStyleNone;
    self.moviePlayer.view.frame = self.view.frame;
    self.moviePlayer.scalingMode = MPMovieScalingModeAspectFill;
    [self.view addSubview:self.moviePlayer.view];
    [self.view sendSubviewToBack:self.moviePlayer.view];
    [self.moviePlayer play];

    // Loop the movie by replaying it whenever playback finishes
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(replayMovie:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:self.moviePlayer];
}

#pragma mark - Helper methods

- (void)replayMovie:(NSNotification *)notification
{
    [self.moviePlayer play];
}
Of course the edges of the video could have been trimmed so that the resolution would be something more along the lines of, say, 700x1080 instead of 1920x1080, but would that have made a huge difference in performance? Or should I compress the video with a certain format and settings to achieve optimal performance? Maybe there is an entirely different approach to this?
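(For reference, one way to re-encode a clip to a smaller size ahead of time is AVAssetExportSession with a lower-quality preset; the same re-encode can also be done offline in any video tool. The sketch below is illustrative only, written in Swift 2 to match the code later in this thread, and the function name and URLs are hypothetical placeholders.)

import AVFoundation

// Illustrative sketch: re-encode a clip with a medium-quality preset so the
// shipped asset is smaller and cheaper to decode. URLs are placeholders.
func compressVideo(sourceURL: NSURL, destinationURL: NSURL, completion: (Bool) -> Void) {
    let asset = AVURLAsset(URL: sourceURL)
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetMediumQuality) else {
        completion(false)
        return
    }
    session.outputURL = destinationURL
    session.outputFileType = AVFileTypeMPEG4
    session.exportAsynchronouslyWithCompletionHandler {
        completion(session.status == .Completed)
    }
}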
Actually I tried using GIFs as described in this article: https://medium.com/swift-programming/ios-make-an-awesome-video-background-view-objective-c-swift-318e1d71d0a2
That approach came with problems of its own, though.
The best way is to use AVFoundation; then you control the video layer itself.
In your header file, declare:
@property (nonatomic, strong) AVPlayerLayer *playerLayer;
#import <AVFoundation/AVFoundation.h>

- (void)viewDidLoad {
    [super viewDidLoad];
    [self.view.layer addSublayer:self.playerLayer];

    // Loop the movie by replaying it whenever the item finishes playing
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(replayMovie:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:nil];
}

- (AVPlayerLayer *)playerLayer {
    if (!_playerLayer) {
        // Find the movie file in the app bundle
        NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"arenaVideo" ofType:@"mp4"];
        NSURL *movieURL = [NSURL fileURLWithPath:moviePath];

        _playerLayer = [AVPlayerLayer playerLayerWithPlayer:[[AVPlayer alloc] initWithURL:movieURL]];
        _playerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
        [_playerLayer.player play];
    }
    return _playerLayer;
}

- (void)replayMovie:(NSNotification *)notification
{
    // The player stops at the end of the item, so seek back to the start before replaying
    [self.playerLayer.player seekToTime:kCMTimeZero];
    [self.playerLayer.player play];
}
Swift 2.0
lazy var playerLayer: AVPlayerLayer = {
    let player = AVPlayer(URL: NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("LaunchMovie", ofType: "mov")!))
    player.muted = true
    player.allowsExternalPlayback = false
    player.appliesMediaSelectionCriteriaAutomatically = false

    // Use the ambient audio session category so the background video
    // does not cut off the user's audio (e.g. if they are listening to music)
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
    } catch let error as NSError {
        print(error)
    }

    var playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.frame
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    playerLayer.backgroundColor = UIColor.blackColor().CGColor
    player.play()

    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerDidReachEnd", name: AVPlayerItemDidPlayToEndTimeNotification, object: nil)
    return playerLayer
}()

override func viewDidLoad() {
    super.viewDidLoad()
    self.view.layer.addSublayer(self.playerLayer)
}

override func viewWillDisappear(animated: Bool) {
    super.viewWillDisappear(animated)
    NSNotificationCenter.defaultCenter().removeObserver(self)
}

// Keep the layer sized to the view if the orientation changes
override func willAnimateRotationToInterfaceOrientation(toInterfaceOrientation: UIInterfaceOrientation, duration: NSTimeInterval) {
    playerLayer.frame = self.view.frame
}

func playerDidReachEnd() {
    // Seek back to the start before replaying, since the player stops at the end
    self.playerLayer.player!.seekToTime(kCMTimeZero)
    self.playerLayer.player!.play()
}
Tested on iOS7 - iOS9
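On iOS 10 and later (newer than this answer targets), AVPlayerLooper handles gapless looping for you, so the notification/seek dance above is unnecessary. A minimal sketch in current Swift, reusing the hypothetical "LaunchMovie.mov" asset from above; note that AVPlayerLooper requires an AVQueuePlayer and must be kept alive by a strong reference:

import AVFoundation
import UIKit

class LoopingVideoViewController: UIViewController {
    // AVPlayerLooper only works with AVQueuePlayer; keep strong references
    // to both, or looping stops as soon as they are deallocated
    private let queuePlayer = AVQueuePlayer()
    private var playerLooper: AVPlayerLooper?

    override func viewDidLoad() {
        super.viewDidLoad()
        // "LaunchMovie.mov" is the same hypothetical bundled asset as above
        guard let url = Bundle.main.url(forResource: "LaunchMovie", withExtension: "mov") else { return }
        playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: AVPlayerItem(url: url))

        let playerLayer = AVPlayerLayer(player: queuePlayer)
        playerLayer.frame = view.bounds
        playerLayer.videoGravity = .resizeAspectFill
        view.layer.insertSublayer(playerLayer, at: 0)

        queuePlayer.isMuted = true
        queuePlayer.play()
    }
}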
I realize that this is an old post, but since I have some experience bringing down CPU usage in my iOS apps, I'll respond.
The first place to look is the AVFoundation framework.
Implementing AVPlayer should help bring the CPU usage down a little,
but the best solution is to use Brad Larson's GPUImage library, which utilizes OpenGL and will reduce CPU usage greatly. Download the library; it includes examples of how to use it. I recommend using GPUImageMovieWriter.
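(For pure playback, as opposed to recording, the relevant GPUImage classes appear to be GPUImageMovie and GPUImageView. The sketch below is an untested outline based on GPUImage's Objective-C API as bridged into Swift 2, reusing the hypothetical "arenaVideo.mp4" asset from the question; treat the class and property names as assumptions to verify against the library's examples.)

import GPUImage

// Hedged sketch: GPUImageMovie decodes the video and GPUImageView renders it
// via OpenGL ES, keeping most of the work off the CPU.
class VideoBackgroundViewController: UIViewController {
    var movie: GPUImageMovie!   // keep a strong reference, or playback stops

    override func viewDidLoad() {
        super.viewDidLoad()

        let movieView = GPUImageView(frame: view.bounds)
        movieView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill
        view.addSubview(movieView)
        view.sendSubviewToBack(movieView)

        let movieURL = NSBundle.mainBundle().URLForResource("arenaVideo", withExtension: "mp4")!
        movie = GPUImageMovie(URL: movieURL)
        movie.playAtActualSpeed = true
        movie.shouldRepeat = true    // built-in looping
        movie.addTarget(movieView)
        movie.startProcessing()
    }
}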