In the sample logs posted in this question, the results are identical. Does anyone know whether there is meant to be a logical difference between the two?
Even Apple's description is confusing. Here is the description of scale:
The natural scale factor associated with the screen ... This value reflects the scale factor needed to convert from the default logical coordinate space into the device coordinate space of this screen...
Here is their description of nativeScale:
The native scale factor for the physical screen
What is the difference between natural and native?
Both scale and nativeScale tell you how many pixels a point corresponds to. But keep in mind that points are rendered to an intermediate buffer of pixels, which is then resized to match the screen resolution. So, when we ask, "1 pt corresponds to how many pixels?" it might mean intermediate pixels (scale) or the final pixels (nativeScale).
On an iPhone 6 Plus (or equivalently sized device), scale is 3, but nativeScale is 2.6. This is because content is rendered at 3x (1 point = 3 pixels) but then the resulting bitmap is scaled down, resulting in 1 point = 2.6 pixels.
So scale deals with the intermediate bitmap, and nativeScale deals with the final bitmap.
This is all without Display Zoom. If the user enables Display Zoom, scale remains 3, since the intermediate buffer is still rendered at 1 point = 3 pixels, but nativeScale becomes 2.8.
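To see these numbers for yourself, here is a minimal sketch that just logs the relevant UIScreen properties; the values in the comments assume an iPhone 6 Plus and are illustrative:

import UIKit

func logScreenScales() {
    let screen = UIScreen.main
    print("scale: \(screen.scale)")               // 3.0 — scale of the intermediate (logical) buffer
    print("nativeScale: \(screen.nativeScale)")   // ~2.6, or ~2.8 with Display Zoom — scale of the final pixel buffer
    print("bounds: \(screen.bounds)")             // screen size in points
    print("nativeBounds: \(screen.nativeBounds)") // screen size in physical pixels
}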
So, if you want to check which physical screen your app is running on, use scale, because it does not change with Display Zoom. For example, if you have an app that runs only on the iPhone Plus models, you could do:
if UIScreen.main.scale != 3 { print("Not supported") }
Not:
if UIScreen.main.nativeScale != 2.6 { print("Not supported") }
The second check fails once the user enables Display Zoom, because nativeScale changes from 2.6 to 2.8 while scale stays at 3.
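If you prefer to wrap the recommended check in a helper, a hedged sketch could look like the following (the function name and the hard-coded 3 are illustrative assumptions, not an official API):

import UIKit

func runsOnPlusSizedScreen() -> Bool {
    // scale stays at 3 on Plus-sized devices whether or not Display Zoom is on,
    // whereas nativeScale moves between roughly 2.6 and 2.8.
    return UIScreen.main.scale == 3
}

// Usage:
// if !runsOnPlusSizedScreen() { print("Not supported") }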