I am struggling to understand exactly what the point size in `UIFont` means. It's not pixels, and it doesn't appear to be the standard definition of a point, which is 1/72 of an inch.
I worked out the pixel size of fonts at various point sizes using `-[NSString sizeWithFont:]` and got the following:
| Point Size | Pixel Size |
| ---------- | ---------- |
| 10.0 | 13.0 |
| 20.0 | 24.0 |
| 30.0 | 36.0 |
| 40.0 | 47.0 |
| 50.0 | 59.0 |
| 72.0 | 84.0 |
| 99.0 | 115.0 |
| 100.0 | 116.0 |
(I did [@"A" sizeWithFont:[UIFont systemFontOfSize:theSize]]
)
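A minimal sketch of such a measurement loop, using `sizeWithAttributes:` (which replaced the now-deprecated `sizeWithFont:` in iOS 7); the loop and the list of point sizes below are illustrative:

```objc
// Sketch: report the size UIKit measures for the string "A" at several point
// sizes. Note that the returned CGSize is in points, not pixels.
#import <UIKit/UIKit.h>

void printMeasuredSizes(void) {
    NSArray<NSNumber *> *pointSizes = @[@10, @20, @30, @40, @50, @72, @99, @100];
    for (NSNumber *pointSize in pointSizes) {
        UIFont *font = [UIFont systemFontOfSize:pointSize.doubleValue];
        // sizeWithFont: is deprecated; sizeWithAttributes: is its replacement.
        CGSize size = [@"A" sizeWithAttributes:@{NSFontAttributeName: font}];
        NSLog(@"%.1f pt -> %.1f x %.1f", font.pointSize, size.width, size.height);
    }
}
```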
And looking at the 72.0 point size, that is not 1 inch, since this is on a device with a DPI of 163, so 1 inch should be 163.0 pixels, right?
Can anyone explain what a "point" in `UIFont` terms is, then? I.e. is my method above wrong, and if I used something else, would I see that the font really is 163 pixels tall at 72 points? Or is a point simply defined relative to something else?
The desktop publishing point (DTP point) or PostScript point is defined as 1⁄72 (approximately 0.0139) of the international inch, making it equivalent to 25.4⁄72 mm ≈ 0.3528 mm. Twelve points make up a pica, and six picas make an inch.
Point size nominally measures the height of the font's body (historically, the metal block each letter was cast on): roughly from the top of the tallest ascender to the bottom of the lowest descender, not the visible height of any particular letter.
The letters in the Calibri font used by Stand Up For Democracy, when measured using an “E scale” ruler used by type designers, were less than 14/72 of an inch tall, which is the definition of 14-point type … (WSJ).
A font has an internal coordinate system, think of it as a unit square, within which a glyph's vector coordinates are specified at whatever arbitrary size accommodates all the glyphs in the font +- any amount of margin the font designer chooses.
At 72.0 points the font's unit square is one inch. Glyph x of font y has an arbitrary size in relation to this inch square. Thus a font designer can make a font that appears large or small in relation to other fonts. This is part of the font's 'character'.
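For the curious, here is a small sketch (mine, using Core Text) that exposes that internal coordinate system: the em square is subdivided into a designer-chosen number of units, and glyph outlines defined in those units are scaled to the requested point size when rendered.

```objc
// Sketch: inspect a font's em square. Glyph outlines are defined in
// "units per em" design units and scaled to the requested point size.
#import <CoreText/CoreText.h>
#import <UIKit/UIKit.h>

void printEmSquareInfo(void) {
    UIFont *uiFont = [UIFont fontWithName:@"HelveticaNeue" size:72.0];
    CTFontRef font = (__bridge CTFontRef)uiFont;  // UIFont is toll-free bridged to CTFont
    NSLog(@"units per em: %u", CTFontGetUnitsPerEm(font));     // e.g. 1000 or 2048
    NSLog(@"cap height at 72 pt: %.1f pt", CTFontGetCapHeight(font));
    NSLog(@"ascent + descent at 72 pt: %.1f pt",
          CTFontGetAscent(font) + CTFontGetDescent(font));
}
```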
So, drawing an 'A' at 72 points tells you that it will be twice as high as an 'A' drawn at 36 points in the same font - and absolutely nothing else about what the actual bitmap size will be.
I.e., for a given font, the only way to determine the relationship between point size and pixels is to measure it.
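As an illustration of "measure it", a sketch (mine) that asks Core Text for the ink bounding box of the glyph 'A' at two point sizes; the two boxes differ by exactly a factor of two, but neither height equals the nominal point size:

```objc
// Sketch: measure the glyph "A" at 36 pt and 72 pt. The bounding boxes scale
// proportionally, but their heights do not equal the point sizes themselves.
#import <Foundation/Foundation.h>
#import <CoreText/CoreText.h>

static CGRect glyphBounds(CGFloat pointSize) {
    CTFontRef font = CTFontCreateWithName(CFSTR("HelveticaNeue"), pointSize, NULL);
    UniChar character = 'A';
    CGGlyph glyph;
    CTFontGetGlyphsForCharacters(font, &character, &glyph, 1);
    CGRect bounds;
    CTFontGetBoundingRectsForGlyphs(font, kCTFontOrientationDefault, &glyph, &bounds, 1);
    CFRelease(font);
    return bounds;
}

void compareGlyphSizes(void) {
    NSLog(@"36 pt 'A' is %.1f pt tall; 72 pt 'A' is %.1f pt tall",
          glyphBounds(36.0).size.height, glyphBounds(72.0).size.height);
}
```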
I am not sure how `-[NSString sizeWithFont:]` measures the height. Does it use the line height or the difference between the peaks of the beziers? What text did you use?
I believe `-[UIFont lineHeight]` would be better for measuring the height.
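For example, a quick sketch (mine, using the non-deprecated `sizeWithAttributes:`) that prints a font's vertical metrics next to what string measurement returns:

```objc
// Sketch: print a font's vertical metrics alongside the measured size of the
// string "A". All of these values are in points, not pixels.
#import <UIKit/UIKit.h>

void printFontMetrics(void) {
    UIFont *font = [UIFont systemFontOfSize:72.0];
    CGSize measured = [@"A" sizeWithAttributes:@{NSFontAttributeName: font}];
    NSLog(@"pointSize = %.1f, lineHeight = %.1f", font.pointSize, font.lineHeight);
    NSLog(@"ascender = %.1f, descender = %.1f, capHeight = %.1f, xHeight = %.1f",
          font.ascender, font.descender, font.capHeight, font.xHeight);
    NSLog(@"measured 'A' = %.1f x %.1f", measured.width, measured.height);
}
```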
Edit:
Also, note that none of the measurement methods returns the size in pixels; they return the size in points. You have to multiply the result by `[UIScreen mainScreen].scale`.
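A minimal sketch of that conversion (the function name is mine):

```objc
// Sketch: convert a measured string size from points to physical pixels by
// multiplying by the screen's scale factor (1.0, 2.0, or 3.0).
#import <UIKit/UIKit.h>

CGSize pixelSizeOfString(NSString *string, UIFont *font) {
    CGSize sizeInPoints = [string sizeWithAttributes:@{NSFontAttributeName: font}];
    CGFloat scale = [UIScreen mainScreen].scale;
    return CGSizeMake(sizeInPoints.width * scale, sizeInPoints.height * scale);
}
```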
Note the difference between typographic points used when constructing the font and points from the iOS default logical coordinate space. Unfortunately, the difference is not explained very clearly in the documentation.
I agree this is very confusing. I'll try to give some basic explanation here to make things clearer.
First, DPI (dots per inch) comes from printing on physical paper, and so do fonts. The point was invented to describe the physical printed size of text, because the inch is too large a unit for typical text sizes: a point is 1/72 of an inch (the exact value actually evolved over history). So yes, if you are writing a document in Word or another word processor for printing, you will get text exactly one inch high if you use a 72 pt font.
Second, the nominal text height is usually different from the rendered strokes you can actually see with your eyes. The original idea of text height came from physical type: all letters were cast on blocks that share the same height, and that block height is what the point size measures. Depending on the letters and the font design, the visible part of the text may be a little shorter than the nominal height. Helvetica Neue is actually very standard: if you measure from the top of a letter "k" to the bottom of a letter "p", it will match the font's point height.
Third, computer displays screwed up DPI, and the definition of the point along with it. The resolution of a computer display is described by its native pixels, such as 1024 x 768 or 1920 x 1080. Software doesn't actually care about the physical size of your monitor, because everything would be very fuzzy if it scaled screen content the way printing scales to paper; the physical resolution simply isn't high enough to keep everything smooth and legible. Instead, software uses a very simple, fixed approach: one fixed DPI for whatever monitor you use. For Windows it's 96 DPI; for Mac it's 72 DPI. That is to say, no matter how many physical pixels make up an inch on your monitor, software ignores it. When the operating system renders text at 72 pt, it is always 96 px high on Windows and 72 px high on a Mac. (That's why Microsoft Word documents always look smaller on a Mac and you usually need to zoom to 125%.)
Finally, on iOS it's very similar. No matter whether it's an iPhone, iPod touch, iPad, or Apple Watch, iOS uses a fixed 72 DPI for non-Retina screens, 144 DPI for @2x Retina displays, and 216 DPI for the @3x Retina display used on the iPhone 6 Plus.
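In code the arithmetic is just the point value times the screen's scale factor; a tiny sketch (mine) of the relationship described above:

```objc
// Sketch: the "fixed DPI" is simply 72 points-per-logical-inch times the
// screen scale, so 72 pt of nominal height maps to 72, 144, or 216 px.
#import <UIKit/UIKit.h>

void printLogicalDPI(void) {
    CGFloat scale = [UIScreen mainScreen].scale;   // 1.0, 2.0, or 3.0
    CGFloat logicalDPI = 72.0 * scale;             // 72, 144, or 216
    NSLog(@"@%.0fx screen: logical DPI = %.0f, so 72 pt spans %.0f px",
          scale, logicalDPI, 72.0 * scale);
}
```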
Forget about the real inch. It only exists in actual printing, not on displays. For software displaying text on your screen, the point is just an artificial ratio to physical pixels.