So I've got this macro:
```scala
import language.experimental.macros
import scala.reflect.macros.Context

class Foo
class Bar extends Foo { def launchMissiles = "launching" }

object FooExample {
  def foo: Foo = macro foo_impl

  def foo_impl(c: Context): c.Expr[Foo] =
    c.Expr[Foo](c.universe.reify(new Bar).tree)
}
```
I've said three times that I want `foo` to return a `Foo`, and yet I can do the following (in 2.10.0-RC3):
```
scala> FooExample.foo
res0: Bar = Bar@4118f8dd

scala> res0.launchMissiles
res1: String = launching
```
The same thing happens if I remove the type parameters on either `c.Expr`. If I really want to make sure that whoever's calling `foo` can't see that they're getting a `Bar`, I have to add a type ascription in the tree itself.
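For example, something like this sketch (reusing the imports and `Foo`/`Bar` definitions above) hides `Bar` from callers:

```scala
object FooHidden {
  def foo: Foo = macro foo_impl

  // Ascribing the type inside the reified tree makes the expansion itself
  // be typed as Foo, so callers no longer see the Bar underneath.
  def foo_impl(c: Context): c.Expr[Foo] =
    c.Expr[Foo](c.universe.reify((new Bar): Foo).tree)
}

// scala> FooHidden.foo                  // typed as Foo, not Bar
// scala> FooHidden.foo.launchMissiles   // no longer compiles
```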
This is actually pretty great: it means, for example, that I can point a macro at a schema of some sort and create an anonymous subclass of some `Vocabulary` class with member methods representing terms in the vocabulary, and these will be available on the returned object.
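A rough sketch of that pattern (the `Vocabulary` class and the term names here are just placeholders, and the imports from the first snippet are assumed):

```scala
class Vocabulary

object VocabExample {
  def vocab: Vocabulary = macro vocab_impl

  // In a real macro the members would be generated from the schema; they are
  // hard-coded here to keep the sketch self-contained. The signature promises
  // only Vocabulary, but the expansion's type is the anonymous subclass, so
  // the generated members are visible at the call site.
  def vocab_impl(c: Context): c.Expr[Vocabulary] =
    c.Expr[Vocabulary](c.universe.reify(
      new Vocabulary {
        def firstName = "firstName"
        def lastName  = "lastName"
      }
    ).tree)
}

// scala> VocabExample.vocab.firstName
// (compiles; accessing structural members like this may require
//  import scala.language.reflectiveCalls)
```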
I'd like to understand exactly what I'm doing, though, so I have a couple of questions. First, what is the return type on the `foo` method actually for? Is it just there as (optional) documentation? It clearly constrains what the macro can expand to (e.g., I can't change it to `Int` in this case), and if I remove it entirely I get an error like this:
```
scala> FooExample.foo
<console>:8: error: type mismatch;
 found   : Bar
 required: Nothing
              FooExample.foo
              ^
```
But I can change it to `Any` and still get a statically typed `Bar` when I call `foo`.
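Concretely, I mean a variant like this (again reusing the imports and `Bar` definition from above), where the call site still sees the expansion's type:

```scala
object FooAnyExample {
  def foo: Any = macro foo_impl

  // The signature is widened to Any, but the expansion is still typed as Bar.
  def foo_impl(c: Context): c.Expr[Any] =
    c.Expr[Any](c.universe.reify(new Bar).tree)
}

// scala> FooAnyExample.foo.launchMissiles   // still compiles: static type is Bar
```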
Second, is this behavior specified somewhere? This seems like a fairly elementary set of issues, but I haven't been able to turn up a clear explanation or discussion.
This behavior is underspecified but intended, though it might appear confusing. We plan to elaborate on the role of the return type in macro signatures, but at the moment I feel the flexibility is a good thing to have.
Also, at times the behavior is inconsistent, e.g. when the macro is caught in the middle of type inference, its static signature will be used (i.e. `Foo` in your example), not the type of the actual expansion. That's because macro expansion is intentionally delayed until type inference is done (so that macro implementations get to see inferred types, not type vars). This is a trade-off and not necessarily the best one, so we're planning to revisit it soon: https://issues.scala-lang.org/browse/SI-6755.
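For instance, a hypothetical illustration of that corner case using the `FooExample` macro from the question (showing the claim above, not verified compiler output):

```scala
// Because expansion is delayed until inference finishes, inference only sees
// the static signature (Foo), so the element type is inferred from Foo, not
// from the Bar that the expansion eventually produces:
val xs = List(FooExample.foo)   // xs: List[Foo]
// xs.head.launchMissiles       // does not compile: the element type is Foo
```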
Another problem in this department is with implicit macros. When the return type of an implicit macro is generic and needs to be inferred from the requested type of an implicit value, bad things happen. This makes it currently impossible to use macros to generate type tags: https://issues.scala-lang.org/browse/SI-5923.