If I have a certain number of bytes to transfer serially, how do I determine which CRC to use (CRC-8, CRC-16, etc., i.e. how many bits of CRC?) while still keeping the error-detection rate high? Is there a formula for this?
The most commonly used polynomial lengths are 9 bits (CRC-8), 17 bits (CRC-16), 33 bits (CRC-32), and 65 bits (CRC-64). A CRC is called an n-bit CRC when its check value is n bits long. For a given n, multiple CRCs are possible, each with a different polynomial.
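For illustration, here is a minimal bitwise CRC-8 sketch in C. The polynomial 0x07 (x^8 + x^2 + x + 1), zero initial value, and no reflection or final XOR are assumptions chosen to match the common CRC-8/SMBus parameter set; a real protocol will specify its own polynomial and parameters.

```c
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Minimal bitwise CRC-8 sketch. Polynomial 0x07, init 0x00, no
 * reflection, no final XOR (CRC-8/SMBus-style parameters, assumed
 * here purely for illustration). */
static uint8_t crc8(const uint8_t *data, size_t len)
{
    uint8_t crc = 0x00;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int bit = 0; bit < 8; bit++) {
            if (crc & 0x80)
                crc = (uint8_t)((crc << 1) ^ 0x07);
            else
                crc = (uint8_t)(crc << 1);
        }
    }
    return crc;
}

int main(void)
{
    /* "123456789" is the conventional CRC check string; with these
     * parameters the expected result is 0xF4. */
    const uint8_t msg[] = "123456789";
    printf("CRC-8 of \"123456789\": 0x%02X\n", crc8(msg, sizeof msg - 1));
    return 0;
}
```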
From the standpoint of CRC length alone, normal statistics apply: an n-bit CRC misses a random corruption with probability 1/2^n, i.e. that is your chance of a false positive. So for an 8-bit CRC you have a 1 in 256 chance, for a 16-bit CRC a 1 in 65,536 chance, and so on.
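The small sketch below just tabulates that 1/2^n false-positive floor for the common CRC widths; the list of widths is taken from the answer above.

```c
#include <stdio.h>
#include <math.h>

/* Rough false-positive (undetected-error) floor for an n-bit CRC:
 * a random corruption slips through with probability about 1/2^n. */
int main(void)
{
    const int widths[] = { 8, 16, 32, 64 };
    const int count = sizeof widths / sizeof widths[0];

    for (int i = 0; i < count; i++) {
        int n = widths[i];
        printf("CRC-%-2d : 1 in 2^%-2d  (about %.3g)\n",
               n, n, pow(2.0, -n));
    }
    return 0;
}
```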
However, the polynomial chosen also has a big impact: polynomials of the same width differ in which burst lengths and bit-flip patterns they are guaranteed to catch. The math depends heavily on the data being transferred, so there is no easy one-size-fits-all answer.
Depending on your communication mechanism, you should also evaluate more than just a CRC (forward error correction, e.g. with systems such as turbo codes, is very useful and common).
To answer this question, you need to know the bit error rate of your channel, which can only be determined empirically. Then, once you have the measured BER, you have to decide what detection rate is "high" enough for your purposes.
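A back-of-the-envelope way to combine the two numbers is sketched below: estimate how often a frame gets corrupted at all from the BER and frame length, then multiply by the 1/2^n miss rate of a candidate CRC. The BER, frame size, and CRC width used here are placeholder assumptions, not measurements.

```c
#include <stdio.h>
#include <math.h>

/* Back-of-the-envelope estimate: how often is a frame corrupted, and
 * roughly how many of those corrupted frames would an n-bit CRC miss
 * (~ 1/2^n of them, treating the corruption as random)? */
int main(void)
{
    const double ber        = 1e-6;   /* assumed measured bit error rate */
    const double frame_bits = 1024.0; /* assumed frame size in bits */
    const int    crc_bits   = 16;     /* candidate CRC width */

    /* P(at least one bit flipped) = 1 - (1 - BER)^frame_bits */
    double p_corrupt    = 1.0 - pow(1.0 - ber, frame_bits);
    double p_undetected = p_corrupt * pow(2.0, -crc_bits);

    printf("P(frame corrupted)             : %.3e\n", p_corrupt);
    printf("P(corrupted and CRC misses it) : %.3e\n", p_undetected);
    return 0;
}
```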
Sending each message, for example, 5 times will give you pretty good detection even on a very noisy channel, but it does crimp your throughput a bit. However, if you are sending commands to a deep-space probe, you may need that redundancy.
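As a toy illustration of that repetition idea, the sketch below recovers each bit of a byte by majority vote across five received copies. The 5-copy count comes from the example above; the byte values and the two injected bit flips are assumptions made up for the demo.

```c
#include <stdint.h>
#include <stdio.h>

#define COPIES 5

/* Recover one byte by bitwise majority vote over COPIES received
 * copies: each bit position is set if more than half the copies have
 * it set. Trades COPIES-times the bandwidth for robustness. */
static uint8_t majority_byte(const uint8_t copies[COPIES])
{
    uint8_t out = 0;
    for (int bit = 0; bit < 8; bit++) {
        int ones = 0;
        for (int c = 0; c < COPIES; c++)
            ones += (copies[c] >> bit) & 1;
        if (ones > COPIES / 2)
            out |= (uint8_t)(1u << bit);
    }
    return out;
}

int main(void)
{
    /* Five received copies of one byte; two arrived corrupted.
     * Majority vote still recovers the original 0x5A. */
    const uint8_t rx[COPIES] = { 0x5A, 0x5A, 0x7A, 0x5A, 0x4A };
    printf("recovered byte: 0x%02X\n", majority_byte(rx));
    return 0;
}
```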