Given the following code and a device running iOS 7.1 or later:
 // UIFontWeightTrait runs from -1.0 (lightest) to 1.0 (heaviest)
 NSDictionary *fontTraitsDictionary = @{ UIFontWeightTrait : @(-1.0) };
 NSDictionary *attributesDictionary = @{
     UIFontDescriptorFamilyAttribute : @"Helvetica Neue",
     UIFontDescriptorTraitsAttribute : fontTraitsDictionary
 };
 UIFontDescriptor *ultraLightDescriptor = [UIFontDescriptor fontDescriptorWithFontAttributes:attributesDictionary];
 UIFont *shouldBeAnUltraLightFont = [UIFont fontWithDescriptor:ultraLightDescriptor size:24];
 NSLog(@"%@", shouldBeAnUltraLightFont);
I would expect the value of shouldBeAnUltraLightFont to be an instance of HelveticaNeue-UltraLight, but instead it is:
<UICTFont: 0x908d160> font-family: "Helvetica"; font-weight: normal; font-style: normal; font-size: 24.00pt
I am following the Apple documentation as far as I understand it. Why is the font family and font weight information completely ignored?
Things I’ve Tried
Regardless of the variations I've tried, the font returned is always a vanilla instance of Helvetica at normal weight.
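One sanity check that rules out a missing font (using UIFont's standard introspection methods) is to dump every installed family and its PostScript names; on iOS 7.x the list should include HelveticaNeue-UltraLight:

 // List every font family and the PostScript names it contains.
 // HelveticaNeue-UltraLight should appear under "Helvetica Neue",
 // which would confirm the face is installed and the problem is in
 // the descriptor matching rather than a missing font.
 for (NSString *family in [UIFont familyNames]) {
     NSLog(@"%@: %@", family, [UIFont fontNamesForFamilyName:family]);
 }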
I ran into the same issue, and the documentation was not much help. Eventually I figured out that using the family attribute combined with the face attribute worked:
 UIFontDescriptor *desc = [UIFontDescriptor fontDescriptorWithFontAttributes:
     @{
         UIFontDescriptorFamilyAttribute : @"Helvetica Neue",
         UIFontDescriptorFaceAttribute : @"Light"
     }
 ];
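Creating the font from that descriptor then picks up the real face instead of falling back to Helvetica. For the ultra-light weight the question is after, the same pattern should work with the corresponding face name; the following is a sketch, assuming @"UltraLight" is the face name Helvetica Neue reports on the device (verify it against the installed faces):

 // Sketch of the same family + face approach for the ultra-light face.
 // "UltraLight" is an assumed face name; check it on your iOS version.
 UIFontDescriptor *ultraLightDesc = [UIFontDescriptor fontDescriptorWithFontAttributes:
     @{
         UIFontDescriptorFamilyAttribute : @"Helvetica Neue",
         UIFontDescriptorFaceAttribute : @"UltraLight"
     }
 ];
 UIFont *ultraLightFont = [UIFont fontWithDescriptor:ultraLightDesc size:24];
 NSLog(@"%@", ultraLightFont.fontName); // expected: HelveticaNeue-UltraLight

The face attribute seems to be what actually drives the weight selection here; the weight-trait dictionary route from the question silently falls back to Helvetica.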