I am running a simple function that gets called in multiple places to help deal with layout in an iPad app during orientation changes. It looks like this:
- (void)getWidthAndHeightForOrientation:(UIInterfaceOrientation)orientation {
    NSLog(@"New Orientation: %d", orientation);
}
And I call it in various places like this:
[self getWidthAndHeightForOrientation:[[UIDevice currentDevice] orientation]];
The function normally contains some simple code that runs if the orientation is portrait or landscape. Unfortunately, it wasn't working as expected when the app started in what should be position 1 (portrait): I got 0 as a result. Later, if the function is called in the same way but the device has never been rotated, I get back a value of 5. What does this mean? Why would it return these values?
In short, why would [[UIDevice currentDevice] orientation] ever return 0 or 5 instead of a value between 1 and 4?
UPDATE:
Because I kept finding bugs in my code due to the way orientation was handled, I wrote a definitive post on how to handle UIDevice or UIInterface orientations: http://www.donttrustthisguy.com/orientating-yourself-in-ios
Did you take a look at the enum values for UIDeviceOrientation (which is what [[UIDevice currentDevice] orientation] actually returns)? From the docs:
typedef enum {
    UIDeviceOrientationUnknown,
    UIDeviceOrientationPortrait,
    UIDeviceOrientationPortraitUpsideDown,
    UIDeviceOrientationLandscapeLeft,
    UIDeviceOrientationLandscapeRight,
    UIDeviceOrientationFaceUp,
    UIDeviceOrientationFaceDown
} UIDeviceOrientation;
So it could conceivably be anything from 0 to 6. The 0 you see at launch is UIDeviceOrientationUnknown (the device orientation hasn't been determined yet), and the 5 is UIDeviceOrientationFaceUp (the device is lying flat).
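If you do keep feeding the UIDevice value into your method, one common workaround is to treat anything that isn't a real portrait or landscape value as a fallback case. Here's a minimal sketch of that idea; falling back to the status bar orientation is just one possible choice, and the portrait/landscape branches are placeholders for whatever your layout code does:
- (void)getWidthAndHeightForOrientation:(UIInterfaceOrientation)orientation {
    // 0 (Unknown), 5 (FaceUp) and 6 (FaceDown) aren't usable for layout,
    // so fall back to the interface orientation reported by the status bar.
    if (!UIInterfaceOrientationIsPortrait(orientation) &&
        !UIInterfaceOrientationIsLandscape(orientation)) {
        orientation = [[UIApplication sharedApplication] statusBarOrientation];
    }

    if (UIInterfaceOrientationIsPortrait(orientation)) {
        // ... portrait width/height values ...
    } else {
        // ... landscape width/height values ...
    }
}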
Edit: Maybe you should be using the methods on your UIViewController (willRotateToInterfaceOrientation:duration:, etc.) instead of calling orientation on the UIDevice?
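For example, something along these lines in your view controller (a rough sketch, assuming getWidthAndHeightForOrientation: is defined on the same class): willRotateToInterfaceOrientation:duration: always hands you one of the four interface orientations, so you never see Unknown or FaceUp there.
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
                                duration:(NSTimeInterval)duration {
    [super willRotateToInterfaceOrientation:toInterfaceOrientation duration:duration];
    // toInterfaceOrientation is always a value between 1 and 4,
    // so it's safe to branch on portrait vs. landscape here.
    [self getWidthAndHeightForOrientation:toInterfaceOrientation];
}
For the initial layout at launch, you can call the same method with self.interfaceOrientation instead of the UIDevice orientation.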