The TrueType and OpenType specifications both define a checkSumAdjustment field in the 'head' (or 'bhed') table of an sfnt. Both describe how to calculate this value, but I can't find any information on why it exists or what it is used for.
Bonus question: Why do I have to subtract from 0xB1B0AFBA?
The point of this value is to allow font engines to detect corruption in the font without actually having to parse all the font data first. Ideally, the checksum would be all the way at the start of the file, but thanks to the need to unify various font formats, it isn't. Instead, it's in the head table. Silly, but we're stuck with it.
Each table in a font has its own checksum value, so that the engine can verify that individual parts of a font are correct "as is". To make things easier, the font itself also has a master checksum that's even cheaper to verify: find the field's offset in the byte stream by parsing a minimal amount of data, then sum the entire byte stream as LONGs (big-endian 32-bit values) while treating the four bytes where this checksum is located as 0x00000000. That single pass tells you whether the font was encoded according to the OpenType spec, without having to look up what every table says its checksum is, where it starts, and how long it is, and then running the same checksum computation several times over different parts of the byte stream. If the master checksum fails, it doesn't even matter whether the checksums for individual tables turn out to be correct: there's something wonky about this font.
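For illustration, here's a minimal sketch of that verification in C, assuming the whole font has already been loaded into memory. The name verify_master_checksum is made up for this example, and real code would also need to bounds-check the table directory itself; only the structure of the check is shown:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Read a big-endian 32-bit value. */
static uint32_t be32(const uint8_t *p) {
    return (uint32_t)p[0] << 24 | (uint32_t)p[1] << 16
         | (uint32_t)p[2] << 8  | (uint32_t)p[3];
}

/* Hypothetical helper: returns 1 if the font's master checksum
   is consistent, 0 otherwise. */
static int verify_master_checksum(const uint8_t *font, size_t len) {
    /* numTables lives at offset 4 of the offset table. */
    uint16_t num_tables = (uint16_t)(font[4] << 8 | font[5]);
    size_t adj = 0;

    /* Table directory: starts at offset 12, 16 bytes per record
       (tag, checkSum, offset, length). checkSumAdjustment sits
       8 bytes into the 'head' table. */
    for (uint16_t i = 0; i < num_tables; i++) {
        const uint8_t *rec = font + 12 + 16 * (size_t)i;
        if (memcmp(rec, "head", 4) == 0) {
            adj = be32(rec + 8) + 8;
            break;
        }
    }
    if (adj == 0 || adj + 4 > len)
        return 0;

    /* Sum the stream as LONGs, treating the four adjustment bytes
       as 0x00000000; a short final word is zero-padded. */
    uint32_t sum = 0;
    for (size_t off = 0; off < len; off += 4) {
        uint8_t word[4] = {0, 0, 0, 0};
        size_t n = len - off < 4 ? len - off : 4;
        memcpy(word, font + off, n);
        if (off == adj)
            memset(word, 0, 4);
        sum += be32(word);
    }

    /* The stored adjustment must equal 0xB1B0AFBA minus this sum
       (all arithmetic modulo 2^32). */
    return be32(font + adj) == 0xB1B0AFBAu - sum;
}

int main(int argc, char **argv) {
    /* Up to 16 MiB; larger fonts would be truncated in this sketch. */
    static uint8_t buf[1 << 24];
    FILE *f = argc > 1 ? fopen(argv[1], "rb") : NULL;
    if (!f) return 1;
    size_t len = fread(buf, 1, sizeof buf, f);
    fclose(f);
    printf("master checksum %s\n",
           verify_master_checksum(buf, len) ? "OK" : "BAD");
    return 0;
}

Equivalently, because the stored adjustment was chosen so the totals cancel, summing the whole file as-is (adjustment bytes included) should come out to exactly 0xB1B0AFBA.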
The subtraction from 0xB1B0AFBA is pretty much just "for historical reasons" because OpenType unified several specs, rather than starting from scratch, so there's some baggage from older formats left in it (the "OS/2" table, for instance, is a general metadata table and has nothing to do with the OS/2 (Warp) operating system anymore, nor has it for a very long time).
The checksum adjustment field ensures that all TrueType/OpenType files have an overall checksum of 0xB1B0AFBA.
I assume this was done so that you can (weakly) validate the file by comparing its checksum to this fixed value, rather than having to read the expected value from within the font.
The reason this works: the sum is first calculated with checksumAdjustment set to 0, and N − old_sum is then stored in that field, so the final sum becomes old_sum + (N − old_sum), which is always N. (All arithmetic is modulo 2^32, so the subtraction simply wraps if old_sum is larger than N.)
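A tiny worked example of that arithmetic, with a made-up old_sum:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t N = 0xB1B0AFBAu;
    uint32_t old_sum = 0x12345678u;    /* made-up sum with the field zeroed */
    uint32_t adjustment = N - old_sum; /* value stored in checksumAdjustment */

    /* Summing the file again now includes the stored adjustment,
       so the total comes out to N regardless of what old_sum was. */
    printf("%#8.8x\n", old_sum + adjustment); /* prints 0xb1b0afba */
    return 0;
}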
I don't know why they chose 0xB1B0AFBA in particular, though.
The checksum algorithm is listed here: https://learn.microsoft.com/en-us/typography/opentype/spec/otff#calculating-checksums
Here is a working implementation that doesn't rely on the host's endianness:
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t sum = 0;  /* running checksum, modulo 2^32 */
    unsigned step = 3; /* cycles 3,2,1,0: byte position within each LONG */
    int c;

    /* Sum stdin as big-endian 32-bit LONGs, one byte at a time;
       a short final word is implicitly padded with zero bytes.
       The cast avoids signed overflow when shifting into the top byte,
       and unsigned step wraps harmlessly before being masked with & 3. */
    while ((c = getchar()) != EOF)
        sum += (uint32_t)c << (step-- & 3) * 8;

    printf("sum: %#8.8x\n", sum);
    return 0;
}
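Run it as ./a.out < font.ttf; for a font that follows the spec, the reported sum should be exactly 0xb1b0afba.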