 

What is the minimum number of bits needed to correct all 2-bit errors?

I learned about Hamming codes and how to use them to correct 1-bit errors and detect all 2-bit errors, but how do I extend this to correcting 2-bit errors, and maybe more?

What is the minimum number of bits needed to correct all 2-bit errors?

asked Apr 12 '11 by user623879


1 Answer

I think I figured it out.

N = number of data bits, k = number of error-correcting bits (e.g., parity bits for Hamming codes).

In any ECC scheme, you have 2^(N+k) possible bit strings.

For a single-bit error:

You must find k such that the total number of possible bit strings is at least the number of strings that differ from some valid codeword in at most 1 bit (each codeword needs its own disjoint set of correctable strings).

The total number of strings within at most 1 bit of some codeword is (2^N)*(N+k+1):

1 string with no error, plus N+k strings with a 1-bit error, for each of the 2^N data words.

2^(N+k) >= (2^N)*(N+k+1)

You simply have to plug in values of k until you find the smallest one that satisfies the above (or solve it however you wish).
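
For example, here is a minimal brute-force sketch in Python (the function name and test values are just illustrative) that finds the smallest k allowed by the single-error inequality:

```python
def min_check_bits_single(N):
    """Smallest k with 2^(N+k) >= (2^N)*(N+k+1),
    i.e. 2^k >= N + k + 1 (the bound for 1-bit correction)."""
    k = 1
    while 2**k < N + k + 1:
        k += 1
    return k

print(min_check_bits_single(4))   # 3 -> the classic Hamming(7,4) code
print(min_check_bits_single(11))  # 4 -> Hamming(15,11)
```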

Similarly, for 2-bit errors:

1 string with no error, N+k strings with a 1-bit error, and (N+k choose 2) strings with a 2-bit error, for each of the 2^N data words.

2^(N+k) >= (2^N)*(1 + (N+k) + (N+k choose 2))
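
The same kind of search generalizes to any number of correctable bit errors t by summing the binomial terms; a sketch (using `math.comb`, available since Python 3.8) is below. Note this gives the smallest k that the counting argument permits, which is a necessary condition; an actual code achieving it may need more check bits.

```python
import math

def min_check_bits(N, t):
    """Smallest k with 2^(N+k) >= (2^N) * sum_{i=0..t} C(N+k, i)
    (the sphere-packing/Hamming bound for correcting up to t bit errors)."""
    k = 1
    while 2**k < sum(math.comb(N + k, i) for i in range(t + 1)):
        k += 1
    return k

print(min_check_bits(4, 1))  # 3, matches the single-error case above
print(min_check_bits(4, 2))  # 6
```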

answered Sep 29 '22 by user623879