What are some of the best hashing algorithms to use for data integrity and deduplication?

I'm trying to hash a large number of files with binary data inside of them in order to: (1) check for corruption in the future, and (2) eliminate duplicate files (which might have completely different names and other metadata).

I know about md5 and sha1 and their relatives, but my understanding is that these are designed for security and therefore are deliberately slow in order to reduce the efficacy of brute force attacks. In contrast, I want algorithms that run as fast as possible, while reducing collisions as much as possible.

Any suggestions?

asked Jul 27 '12 by King Spook

People also ask

Which hashing algorithm is the best?

Probably the one most commonly used is SHA-256, which the National Institute of Standards and Technology (NIST) recommends using instead of MD5 or SHA-1. The SHA-256 algorithm returns a hash value of 256 bits, or 64 hexadecimal digits.

Which hashing algorithm is recommended for the protection?

SHA-256 is one of the hashing algorithms in the SHA-2 family (covered by royalty-free U.S. patent 6829355). It is the most widely used of the family, often in conjunction with digital signatures, in authentication and encryption protocols such as TLS, SSL, SSH, and PGP.

What is the most commonly used hashing algorithm?

The MD5 algorithm, defined in RFC 1321, is probably the most well-known and widely used hash function.

Is hashing used for data integrity?

Hash values are also useful for verifying the integrity of data sent through insecure channels. The hash value of received data can be compared to the hash value of data as it was sent to determine whether the data was altered.
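
As a minimal illustration of that check in Python (hashlib is in the standard library; the file path and recorded digest below are placeholders):

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
            h.update(chunk)
    return h.hexdigest()

# Compare the digest computed on receipt against the digest recorded
# when the data was sent; any difference means the data was altered.
recorded_digest = "..."  # placeholder: digest noted at send time
if sha256_of_file("received.bin") == recorded_digest:  # hypothetical path
    print("data intact")
else:
    print("data altered")
```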


2 Answers

You are mostly right. If your system has no adversary, using cryptographic hash functions is overkill given their security properties.


Collisions depend on the number of bits, b, of your hash function and on the number of hash values, N, you expect to compute. The academic literature argues that this collision probability should be below the hardware error probability, so that a collision under the hash function is less likely than an error while comparing the data byte-by-byte [ref1,ref2,ref3,ref4,ref5]. Hardware error probability is in the range of 2^-12 to 2^-15 [ref6]. If you expect to generate N = 2^q hash values, then your collision probability is bounded by the following equation, which already takes the birthday paradox into account:
p(N, b) <= N^2 / 2^(b-1) = 2^(2q - b + 1)

The number of bits of your hash function is directly proportional to its computational cost. So you want to find the hash function with the fewest bits possible that still keeps the collision probability at acceptable values.


Here's an example of how to make that analysis:

  • Let's say you have f = 2^15 files;
  • The average size of each file, lf, is 2^20 bytes;
  • You plan to divide each file into chunks of average size lc = 2^10 bytes;
  • Each file will therefore be divided into c = lf/lc = 2^10 chunks;
  • You will then hash N = f*c = 2^25 objects in total, so q = 25.

From that equation, the collision probability for the two candidate hash sizes is as follows (a short Python sketch reproducing these numbers follows the list):

  • P(hash=64 bits) = 2^(2*25-64+1) = 2^-13 (less than 2^-12)
  • P(hash=128 bits) = 2^(2*25-128+1) = 2^-77 (far below 2^-12)
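
Here is a minimal sketch of that analysis in Python (the function simply evaluates the 2^(2q - b + 1) bound from the equation above; the variable names are mine):

```python
# Collision-probability bound p = 2^(2q - b + 1) for N = 2^q hashed
# objects and a b-bit hash (birthday paradox already accounted for).
def collision_probability_log2(q: int, b: int) -> int:
    """Return log2 of the collision-probability bound."""
    return 2 * q - b + 1

f_log2 = 15   # 2^15 files
lf_log2 = 20  # average file size: 2^20 bytes
lc_log2 = 10  # average chunk size: 2^10 bytes

c_log2 = lf_log2 - lc_log2  # 2^10 chunks per file
q = f_log2 + c_log2         # N = 2^25 hashed objects, so q = 25

for b in (64, 128):
    print(f"b={b}: p = 2^{collision_probability_log2(q, b)}")
# Output:
# b=64: p = 2^-13
# b=128: p = 2^-77
```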

Now you just need to decide which 64- or 128-bit non-cryptographic hash function to use, knowing that 64 bits stays only just below the hardware error probability (but will be faster) and 128 bits is a much safer option (though slower).


Below is a short list, taken from Wikipedia, of non-cryptographic hash functions. I know MurmurHash3, and it is much faster than any cryptographic hash function (a usage sketch follows the list):

  1. Fowler–Noll–Vo: 32, 64, 128, 256, 512, and 1024 bits
  2. Jenkins: 64 and 128 bits
  3. MurmurHash: 32, 64, 128, and 160 bits
  4. CityHash: 64, 128, and 256 bits
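
For example, MurmurHash3 is available in Python through the third-party mmh3 package (using it here is my assumption; any function on the list would serve):

```python
# pip install mmh3  -- third-party binding for MurmurHash3
import mmh3

data = b"some chunk of binary data"

h32 = mmh3.hash(data)      # 32-bit signed integer
h128 = mmh3.hash128(data)  # 128-bit unsigned integer
print(h32, hex(h128))
```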
answered Sep 23 '22 by Leaurus


MD5 and SHA-1 are cryptographic hash functions, but they are not deliberately slow; fast computation was in fact one of their design goals. I've used MD5 for deduplication myself (with Python), and performance was just fine.

This article claims machines today can compute the MD5 hash of 330 MB of data per second.

SHA-1 was developed as a safer alternative to MD5 once weaknesses were found that let attackers craft inputs hashing to the same MD5 value, but I think for your purposes MD5 will work fine. It certainly did for me.
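
A sketch of that kind of MD5-based deduplication with the standard library (the directory path is hypothetical, and I only report duplicates rather than delete them):

```python
import hashlib
import os
from collections import defaultdict

def md5_of_file(path: str) -> str:
    """Stream a file through MD5 and return its hex digest."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Group files by content hash: any group with more than one path is a
# set of duplicates, regardless of file names or other metadata.
groups = defaultdict(list)
for root, _dirs, files in os.walk("data/"):  # hypothetical directory
    for name in files:
        path = os.path.join(root, name)
        groups[md5_of_file(path)].append(path)

for digest, paths in groups.items():
    if len(paths) > 1:
        print(digest, "->", paths)
```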

answered Sep 23 '22 by larsga