The following code results in an error (Playground):

#![feature(specialization)]

trait Foo {
    type Assoc;
    fn foo(&self) -> &Self::Assoc;
}

default impl<T> Foo for T {
    type Assoc = T;
    fn foo(&self) -> &Self::Assoc {
        self
    }
}
Error:
error[E0308]: mismatched types
--> src/main.rs:20:9
|
20 | self
| ^^^^ expected associated type, found type parameter
|
= note: expected type `&<T as Foo>::Assoc`
found type `&T`
This is strange since <T as Foo>::Assoc is T, so it should work. Stranger still: when I remove the default keyword from the impl, it works (but of course, in my real code, I need to mark the impl as default).
The same error happens when providing default values in the trait definition (Playground):
#![feature(specialization)]
#![feature(associated_type_defaults)]

trait Foo {
    type Assoc = Self;
    fn foo(&self) -> &Self::Assoc {
        self
    }
}
What's going on here? Is this a compiler bug? Or -- and that's why I'm asking this question -- does this error make sense because there is something special about specialization that I haven't understood yet? In case this is a bug, mem::transmute is surely safe, riiiight?
The specialization feature is showing no signs of stabilising, mostly because of soundness concerns, so you should expect some problems.
You have this:
#![feature(specialization)]

trait Foo {
    type Assoc;
    fn foo(&self) -> &Self::Assoc;
}

default impl<T> Foo for T {
    type Assoc = T;
    fn foo(&self) -> &Self::Assoc {
        self
    }
}
But imagine that you added another implementation with its own associated type but without implementing foo. This implementation's foo will be "inherited" from the other, less specific, implementation:
impl<T: SomeConstraint> Foo for T {
    type Assoc = NotT;
}
Then there'd be a problem. Your foo would be returning a T but, whenever T is SomeConstraint, there'd be a type mismatch because it should be returning a NotT.
RFC 2532 — associated type defaults mentions a possible solution in its Future work section: a hypothetical default block could be used to indicate that associated type(s) and method(s) would need to be specialized together. There's no sign of when such a feature would be considered for inclusion, however.
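Under that idea, the grouping might look something like this. To be clear, this is purely hypothetical syntax sketched from the RFC's discussion; no current compiler accepts it:

```
trait Foo {
    type Assoc;
    fn foo(&self) -> &Self::Assoc;
}

impl<T> Foo for T {
    // Hypothetical `default { ... }` block: the associated type and
    // the method must be specialized together, so a more specific
    // impl overriding `Assoc` would be forced to override `foo` too.
    default {
        type Assoc = T;
        fn foo(&self) -> &Self::Assoc {
            self
        }
    }
}
```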