So I created a new project with the latest version of Xcode and tried to log my app's screen size (to determine the device type for the UI). I ran the following code on my iPhone 5:
NSLog(@"%f", [[UIScreen mainScreen] bounds].size.height);
This returned 480, which is the screen height of the old iPhone family. I tried it in the simulator and the same thing happened. Is there some property I have to enable in the project for it to recognize the correct screen size?
This only happens on iPhone 5 and newer devices; if I run the game on my iPad, it correctly reports the 1024-point screen size.
I know for a fact that this code has worked in the past. I made a game a while back using the exact same method and it had no problem detecting the screen size, but that project was built in Xcode 4.x.
I am using a custom View Controller, which I create in the App Delegate with the following code:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    if ([Global getDevice] == 1) { // iPhone 5+
        self.window.rootViewController = [[FivePlus alloc] initWithNibName:nil bundle:nil];
    } else if ([Global getDevice] == 2) { // iPhone 4S and earlier
        self.window.rootViewController = [[FourSMinus alloc] initWithNibName:nil bundle:nil];
    } else { // iPad
        self.window.rootViewController = [[iPad alloc] initWithNibName:nil bundle:nil];
    }
    [[self window] makeKeyAndVisible];
    // Override point for customization after application launch.
    return YES;
}
+ (int)getDevice {
    if ([[UIScreen mainScreen] bounds].size.height == 568 || [[UIScreen mainScreen] bounds].size.width == 568) {
        return 1; // iPhone 5+
    } else if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
        return 3; // iPad
    } else {
        return 2; // iPhone 4S and earlier
    }
}
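For reference, the same check can be written a little more compactly by taking the larger bounds dimension, so it reads the same in either orientation. This is just a sketch of an equivalent version of the `getDevice` method above, using the same return codes:

```objectivec
// Sketch: equivalent to the getDevice above, but using MAX() so the
// 568-point check works regardless of interface orientation.
+ (int)getDevice {
    CGSize size = [[UIScreen mainScreen] bounds].size;
    if (MAX(size.width, size.height) == 568.0) {
        return 1; // iPhone 5 family
    } else if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
        return 3; // iPad
    }
    return 2; // iPhone 4S and earlier
}
```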
A UIScreen object defines the properties associated with a hardware-based display. iOS devices have a main screen and zero or more attached screens. Each screen object defines the bounds rectangle for the associated display and other interesting properties.
Apparently, iOS relies solely on the presence of a launch image at the iPhone 5's resolution in order to let the app run at that resolution.
There are two solutions to this problem:
1. Use Asset Catalogs
When you create a new project, Xcode sets up an asset catalog, which stores your launch image files. Add one of these to your project, assign your launch images, and presto!
2. Dig out some old files
If you've been around Xcode for a while, you'll know that one of the later versions of Xcode 4.x automatically created three default launch image files for your app, called Default.png, [email protected], and [email protected].
You need these files in your app. They are essentially just black images with the resolutions 480x320, 960x640, and 1136x640, respectively (note that these are in HxW, not WxH).
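Once the launch images are in place, a quick sanity check is to log the bounds again, as in the original question. The expected values below assume an iPhone 5:

```objectivec
// After adding the launch images, the reported bounds (in points)
// should match the device.
CGRect bounds = [[UIScreen mainScreen] bounds];
NSLog(@"screen: %.0f x %.0f", bounds.size.width, bounds.size.height);
// An iPhone 5 should now log "screen: 320 x 568" rather than "screen: 320 x 480".
```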
Hopefully this helps someone else who encounters this ridiculous problem.
iOS will often "pretend" your screen is a different size than it really is. Apple assumes, for example, that if you don't supply the right launch image for a given resolution, you haven't designed your app to work properly at that resolution, so it will run your app at a different size. In the extreme case, an iPhone-only app running on an iPad will report 320 x 480.
As far as your application is concerned, the screen size reported is the screen size available to your application. If it reports 320 x 480, then that is what your application can use. Anything drawn beyond 480 points will not be visible.
You convince iOS to run your app at the resolution you want by, for example, supplying a launch image of the right size. In the case of the iPhone 6 and 6 Plus, the user can enable "Zoomed" display mode, so they behave as if they had the screen of an iPhone 5 or 6 (just physically bigger).
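On iOS 8 and later, one way to detect the kind of scaling mentioned above (Zoomed mode, or an app running in a compatibility size) is to compare UIScreen's scale and nativeScale, which differ whenever the system is rendering the app at other than the display's native resolution. A minimal sketch, assuming iOS 8+ so that nativeScale is available:

```objectivec
// iOS 8+: when display zoom (or another scaling mode) is active,
// the logical scale and the native hardware scale differ.
UIScreen *screen = [UIScreen mainScreen];
if (screen.scale != screen.nativeScale) {
    NSLog(@"Display is being scaled (e.g. Zoomed mode or a compatibility size).");
}
```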