Seeking explanation for the difference in animation performance between iOS6 and iOS7

I have been working on an iPad app that performs animations on very large images (full-screen images that can be zoomed to 2x and still be retina quality). I have spent a lot of time getting smooth transitions when zooming and panning. When running the app on iOS 7, however, the animations become really jerky (low frame rate).

Further testing shows that it is the zoom animation that causes the problem (panning does not cause a problem). Interestingly, I have been able to fix it by setting the alpha of the image being scaled to 0.995 (instead of 1.0).

I have two questions:

  1. What has changed in iOS7 to make this happen?
  2. Why does changing the opacity of the view make a difference?

Further information for the above questions:

Animation Setup

The animations are all pre-defined and are played upon user interaction. They are all a mix of pan and zoom, and they are really simple:

[UIView animateWithDuration:animationDuration delay:animationDelay options:UIViewAnimationOptionCurveEaseInOut animations:^{
    // nextFrame is the pre-defined target frame for this step (a mix of pan and zoom)
    self.frame = nextFrame;
    //...
} completion:^(BOOL finished) {
    //...
}];

To fix the jerky animation, I set the alpha before the animation:

self.alpha = 0.99;

Some interesting points:

  1. Setting the alpha inside of the animation block works as well (see the sketch after this list)
  2. Setting the alpha back to 1.0 after the animation and then doing the reverse animation with a 1.0 alpha does not give a smooth reverse animation.
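
For reference, the in-block variant looks roughly like this (a minimal sketch based on the snippet above; animationDuration, animationDelay and nextFrame are the same values used there):

[UIView animateWithDuration:animationDuration delay:animationDelay options:UIViewAnimationOptionCurveEaseInOut animations:^{
    self.alpha = 0.99;   // dropping just below full opacity inside the block also avoids the jerkiness
    self.frame = nextFrame;
} completion:^(BOOL finished) {
    //...
}];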

Opacity fix

I have previously used the opacity fix to make animations smooth when scaling and panning multiple images. For example, I had two large images panning and scaling at different speeds, one on top of the other. When a previously un-rendered part of the lower image (the image on the bottom) became visible, the animation would become jerky (panning as well as scaling). My theory for why alpha helps in that case is that if the top image has a bit of transparency, the bottom image must always be rendered, which means it can be cached before the animation takes place. This theory is supported by the fact that the reverse animation is not jerky. (I would be interested to hear if anyone has different thoughts on this as well.)

Having said the above, I don't know how this would have an effect when there is just one image (as in the situation I am describing in my question), especially since the reverse animation is still jerky after the jerky forward animation. Another point of difference between the two situations is that only scaling causes the problem in the current issue, while in the double-image issue it was panning as well as scaling.


I hope the above is clear; any insights appreciated.

asked Sep 26 '13 by lindon fox
1 Answer

Look at Group Opacity. iOS 7 has it turned ON by default, and this changes the way views/layers are composited:

When the UIViewGroupOpacity key is not present, the default value is now YES. The default was previously NO.

This means that subviews of a transparent view will first be composited onto that transparent view, then the precomposited subtree will be drawn as a whole onto the background. A NO setting results in less expensive, but also less accurate, compositing: each view in the transparent subtree is composited onto what’s underneath it, according to the parent’s opacity, in the normal painter’s algorithm order.

(source: iOS7 Release Notes)

With this setting on, compositing (including during animations) is much more expensive.
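
If you want to restore the old behaviour, the key quoted above can be set explicitly, and CALayer also exposes a per-layer switch. A minimal sketch (imageView stands in for whatever view you are animating):

#import <QuartzCore/QuartzCore.h>

// App-wide: opt out via the key mentioned in the release notes (Info.plist)
//   <key>UIViewGroupOpacity</key>
//   <false/>

// Per-layer: opt a single view out of group-opacity compositing (iOS 7 and later)
imageView.layer.allowsGroupOpacity = NO;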

Also, have a look at the CoreGraphics Instruments tool to check whether a lot of off-screen image compositing is going on.

Is anything changing in the view being animated? That would cause the rendered layer image to be discarded from the backing store more often.
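
If the view's contents stay static for the duration of the animation, one related option (a sketch of a general technique, not something this answer prescribes) is to rasterize the layer so its rendered image is kept as a cached bitmap while it is animated:

// Cache the layer's rendered image for the duration of the animation
self.layer.shouldRasterize = YES;
self.layer.rasterizationScale = [UIScreen mainScreen].scale; // avoid blurry output on retina screens

// ... run the animation ...

// Turn rasterization off again afterwards so normal rendering resumes
self.layer.shouldRasterize = NO;

Note that rasterization caches the bitmap at a fixed scale, so a rasterized layer that is zoomed may look soft until rasterization is switched off again.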

answered Sep 29 '22 by Cocoanetics