When I'm creating a SHA-256 sum with the sha256 crate, there is an old, deprecated function digest_bytes (which works), and the deprecation notice suggests using the new digest function instead. When I use the new function as a drop-in replacement, I get an error, and the compiler suggests a fix that looks very strange to me but works.
fn main() {
    let bytes: Vec<u8> = std::fs::read("./Cargo.toml").unwrap();
    let hash: String = sha256::digest_bytes(&bytes); // deprecated but works
    // let hash: String = sha256::digest(&bytes); // does not work
    // let hash: String = sha256::digest(&*bytes); // works, but why?
    println!("{}", hash);
}
Here is the error I get when I use the sha256::digest(&bytes); line:
--> src/main.rs:3:31
|
3 | let hash = sha256::digest(&bytes);
| -------------- ^^^^^^ the trait `Sha256Digest` is not implemented for `&Vec<u8>`
| |
| required by a bound introduced by this call
|
note: required by a bound in `sha256::digest`
--> /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/sha256-1.1.4/src/lib.rs:48:18
|
48 | pub fn digest<D: Sha256Digest>(input: D) -> String {
| ^^^^^^^^^^^^ required by this bound in `digest`
help: consider dereferencing here
|
3 | let hash = sha256::digest(&*bytes);
| +
Applying the suggested &* fix works, but I don't understand why. Can someone explain this to me?
A Vec can be dereferenced to get a slice. If bytes has the type Vec<u8>, then *bytes has the type [u8], and &*bytes has the type &[u8]. This behaviour comes from the Deref implementation for Vec<_>.
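Here is a minimal, std-only sketch (independent of the sha256 crate) showing what &*bytes produces:

fn main() {
    let bytes: Vec<u8> = vec![1, 2, 3];
    // `*bytes` goes through `<Vec<u8> as Deref>::deref` and has type `[u8]`;
    // taking a reference to that unsized slice gives a `&[u8]`.
    let slice: &[u8] = &*bytes;
    assert_eq!(slice, [1, 2, 3]);
}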
The digest_bytes() function is declared like this:
pub fn digest_bytes(input: &[u8]) -> String
If you call this function as sha256::digest_bytes(&bytes), the compiler sees that you pass in an &Vec<u8>, while the function expects an &[u8]. According to Rust's coercion rules, the compiler will apply a deref coercion in this case, implicitly converting &Vec<u8> to &[u8].
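To illustrate with a small, self-contained example (takes_slice is a hypothetical stand-in for digest_bytes): because the parameter type &[u8] is spelled out in the signature, the compiler knows the coercion target and converts &Vec<u8> for you.

fn takes_slice(input: &[u8]) -> usize {
    input.len()
}

fn main() {
    let bytes: Vec<u8> = vec![1, 2, 3];
    // `&bytes` is a `&Vec<u8>`; deref coercion turns it into `&[u8]`
    // because the expected parameter type is known to be exactly `&[u8]`.
    assert_eq!(takes_slice(&bytes), 3);
}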
However, the digest() function is declared like this:
pub fn digest<D: Sha256Digest>(input: D) -> String
If you call digest(&bytes), the compiler again sees that you pass in an &Vec<u8>, but it does not know what target type you actually want. The declaration just states that you need some type D that implements the Sha256Digest trait, and there could be many types implementing that trait. The compiler does not try applying arbitrary coercions in an attempt to end up with a type that satisfies the trait bound (and doing so would be a really bad idea). So for the generic version of this function, you have to manually dereference bytes to a &[u8], which is what &*bytes does.
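The same effect can be reproduced without the sha256 crate; Hashable and digest_like below are hypothetical stand-ins for Sha256Digest and digest:

trait Hashable {
    fn feed(&self) -> usize;
}

// The trait is implemented for `&[u8]`, but not for `&Vec<u8>`.
impl Hashable for &[u8] {
    fn feed(&self) -> usize {
        self.len()
    }
}

fn digest_like<D: Hashable>(input: D) -> usize {
    input.feed()
}

fn main() {
    let bytes: Vec<u8> = vec![1, 2, 3];
    // digest_like(&bytes); // error: `Hashable` is not implemented for `&Vec<u8>`
    let n = digest_like(&*bytes); // explicit deref to `&[u8]` satisfies the bound
    assert_eq!(n, 3);
}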
The main difference between the two invocations is that in the first case, the compiler knows that the required type is &[u8], while in the second case, all the compiler knows is that the required type is some D that needs to be inferred by the compiler first. You can also tell the compiler what you want D to be; sha256::digest::<&[u8]>(&bytes) will work as well, since you told the compiler that &bytes needs to be converted to &[u8].
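Assuming the same sha256 1.x dependency as in the question, all of the following should therefore compile, because in each case the argument reaches digest as a &[u8] (the bytes.as_slice() variant is an additional option not mentioned above):

fn main() {
    let bytes: Vec<u8> = std::fs::read("./Cargo.toml").unwrap();

    // Explicit dereference, as suggested by the compiler.
    let a = sha256::digest(&*bytes);
    // Name the type parameter so the coercion target is known.
    let b = sha256::digest::<&[u8]>(&bytes);
    // Ask the Vec for a slice directly.
    let c = sha256::digest(bytes.as_slice());

    assert_eq!(a, b);
    assert_eq!(b, c);
    println!("{}", a);
}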