Why does the UISlider view ignore the alpha value when it is set to 0.5?
Code:
for (int i = 0; i < 3; i++) {
    UISlider *slider = [[[UISlider alloc]
        initWithFrame:CGRectMake(0, i * 30, 200, 30)]
        autorelease];
    slider.alpha = 0.4 + (CGFloat)i / 10.0f;
    [window addSubview:slider];
}
Result:
The sliders have alpha values 0.4, 0.5 and 0.6, and as you can see, the middle one with 0.5 is completely opaque. It seems to only occur with alpha 0.5. I have tested other UI controls and they work as expected when alpha is set to 0.5.
Reproduced with iOS 4.2 on real device and with iOS 3.2 and 4.2 in simulator.
BTW, if someone is curious how and why I hit this problem: it's the sliding direction pad configuration for a puzzle game called Slippy.
As you said that other UI controls work with 0.5 alpha, there should be no difference with UISlider, since it inherits the alpha property from the UIView class, and the documentation for the opaque property says: "You should always set the value of this property to NO if the view is fully or partially transparent." Maybe you can try to follow that advice.
If there's really a bug with the 0.5 value, you can simply change your starting transparency from 0.4 to 0.41/0.39 without any visible difference:
slider.alpha = 0.41f + (CGFloat)i / 10.0f;
Finally, you can output the resulting alpha values to some labels to check whether they are the expected ones, or output the (CGFloat)i value to see if something is wrong with the type casting.
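For instance, a minimal sketch of that check using NSLog instead of labels (assuming the same loop and that console output is enough):

for (int i = 0; i < 3; i++) {
    CGFloat alpha = 0.4f + (CGFloat)i / 10.0f;
    // Log the cast and the computed alpha to verify they come out as 0.4, 0.5 and 0.6.
    NSLog(@"i = %d, (CGFloat)i = %f, alpha = %f", i, (CGFloat)i, alpha);
}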