 

Hidden Difference Between Unity 3D Quality Settings?

I am working on optimizing my game for iOS, and I'm running into a really strange problem with quality settings in Unity3D that has me completely stuck!

I have four quality settings set up in the game via:

Edit -> Project Settings -> Quality

The four settings are: Simple, Good, Beautiful, Fantastic

The short version is that, regardless of the options I set inside the quality settings panel, Simple always performs better than Good.

For example, if I give Simple and Good identical settings for all rendering options, with everything at minimum (shadows off, no anti-aliasing, 1/8 textures, etc.), and test the game on an iPad starting with each as the default quality level:

Simple - ~26FPS
Good - ~6FPS 

If I change Simple to use 1/4 textures and leave Good at 1/8 textures, the results are almost the same:

Simple - ~24FPS
Good - ~6FPS 

I can clearly see that the textures are better/more detailed in Simple mode at 1/4, yet it runs almost 20 FPS faster than Good mode, which is using 1/8-size textures.

Attaching the Unity iOS profiler shows that when running in Good mode the CPU suffers a constant load of "Overhead", which is not present in Simple mode (with identical quality settings). So something is clearly changing when I go from Simple to Good, but it doesn't appear to be anything I can change from the quality settings menu.

From reading the Unity manual page, I get the impression that the different quality settings are simply named placeholders, since you can add/remove levels as you like. But from my experiments it seems that something else changes when I start in Simple vs. Good (or any other level).

I have tried launching the game with Simple/Good as the default, and also switching manually in code (using QualitySettings.SetQualityLevel in C#), and I get the same performance either way.
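
The manual switch is essentially just a call like the one below (a minimal sketch rather than my exact code; I'm assuming the level index matches the order in the Quality panel, so 0 = Simple, 1 = Good):

    using UnityEngine;

    public class QualitySwitcher : MonoBehaviour
    {
        void Start()
        {
            // Assumed order from the Quality panel: 0 = Simple, 1 = Good, 2 = Beautiful, 3 = Fantastic
            QualitySettings.SetQualityLevel(0, true); // 'true' also applies the expensive per-level changes immediately
            Debug.Log("Active quality level: " + QualitySettings.names[QualitySettings.GetQualityLevel()]);
        }
    }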

I feel like I am missing something key but I cannot seem to figure this one out.

asked Mar 02 '15 by Jokode


1 Answer

Quality levels only make a difference when you use the features they control, since those features require extra calculations. If you don't use those features, there is no practical difference between the levels. If you want to improve your FPS, here are some tips:

  1. Use texture atlasing to reduce draw calls.
  2. Optimize the calculations in your Update functions, and cache the Transform and component references they use (see the caching sketch right after this list).
  3. Don't search for components every frame; in some cases you can use the singleton pattern instead (see the singleton sketch after the links below).
  4. Don't instantiate and destroy objects every frame; use object pooling (see the pooling sketch after the links below).
  5. Reduce particle emission rates as much as you can.
  6. Optimize your shaders. If you target mobile, use the mobile shader variants; for example, change the Diffuse shader to Mobile/Diffuse.
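
As a rough illustration of points 2 and 3, here is a minimal sketch (class and field names are hypothetical) of caching the Transform and a Rigidbody once in Awake instead of looking them up every frame:

    using UnityEngine;

    public class CachedMover : MonoBehaviour
    {
        private Transform cachedTransform;   // cached once instead of repeated lookups
        private Rigidbody cachedRigidbody;   // avoids GetComponent<Rigidbody>() every frame

        void Awake()
        {
            cachedTransform = transform;
            cachedRigidbody = GetComponent<Rigidbody>();
        }

        void Update()
        {
            // Per-frame work uses the cached reference, no component searches here
            cachedTransform.Rotate(0f, 90f * Time.deltaTime, 0f);
        }

        void FixedUpdate()
        {
            // Physics work also reuses the cached Rigidbody
            cachedRigidbody.AddForce(Vector3.forward);
        }
    }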

Here are some useful links: an object pooling video tutorial, and the singleton pattern in Unity3D C#.
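
For point 4, a very small object pool could look roughly like this (class and member names are hypothetical, not taken from the tutorial above): reuse inactive instances instead of calling Instantiate/Destroy every frame.

    using System.Collections.Generic;
    using UnityEngine;

    public class SimplePool : MonoBehaviour
    {
        public GameObject prefab;      // prefab to pool
        public int initialSize = 20;

        private readonly Queue<GameObject> pool = new Queue<GameObject>();

        void Awake()
        {
            // Pre-instantiate a fixed number of inactive objects up front
            for (int i = 0; i < initialSize; i++)
            {
                GameObject obj = (GameObject)Instantiate(prefab);
                obj.SetActive(false);
                pool.Enqueue(obj);
            }
        }

        public GameObject Spawn(Vector3 position)
        {
            // Reuse an inactive instance if available, otherwise grow the pool
            GameObject obj = pool.Count > 0 ? pool.Dequeue() : (GameObject)Instantiate(prefab);
            obj.transform.position = position;
            obj.SetActive(true);
            return obj;
        }

        public void Despawn(GameObject obj)
        {
            // Deactivate instead of destroying, and return to the pool
            obj.SetActive(false);
            pool.Enqueue(obj);
        }
    }

Callers then use Spawn/Despawn instead of Instantiate/Destroy for frequently recycled objects like bullets or particles.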
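
And for point 3, a minimal Unity singleton sketch (again, names are hypothetical) that lets other scripts reach a manager without searching for it every frame:

    using UnityEngine;

    public class GameManager : MonoBehaviour
    {
        public static GameManager Instance { get; private set; }

        void Awake()
        {
            // Keep exactly one instance alive and globally reachable
            if (Instance != null && Instance != this)
            {
                Destroy(gameObject);
                return;
            }
            Instance = this;
        }
    }

Other scripts can then reference GameManager.Instance directly instead of calling a find/search method inside Update.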

answered Oct 12 '22 by haksist