Below I define a type variable, a generic type alias, and a dot product function. mypy doesn't raise an error. Why not? I would expect it to raise an error for v3, because it's a vector of strings and I've specified that T must be an int, float, or complex.
from typing import Any, Iterable, Tuple, TypeVar

T = TypeVar('T', int, float, complex)
Vector = Iterable[T]

def dot_product(a: Vector[T], b: Vector[T]) -> T:
    return sum(x * y for x, y in zip(a, b))

v1: Vector[int] = []    # same as Iterable[int], OK
v2: Vector[float] = []  # same as Iterable[float], OK
v3: Vector[str] = []    # no error - why not?
I think the problem here is that when you construct a type alias, you're not actually creating a new type -- you're just giving a nickname or alternate spelling to an existing one.
And if all you're doing is providing an alternative spelling for a type, it ought to be impossible to add any extra behavior while doing so. That's exactly what's happening here: you're trying to attach additional information (your three type constraints) to Iterable, and mypy is ignoring it. There's a note saying basically this at the bottom of the mypy docs on generic type aliases.
The fact that mypy silently uses your TypeVar without warning that its additional constraints are being ignored actually feels like a bug -- specifically, a usability bug. Mypy ought to raise a warning here and disallow anything other than unrestricted type variables inside a generic type alias.
So what can you do to type your code?
Well, one clean solution would be to not bother creating the Vector type alias -- or to create it, but not worry about constraining what it can be parameterized with. This means a user can create a Vector[str] (aka an Iterable[str]), but that's really no big deal: they'll get a type error the moment they try actually passing it into any function, like your dot_product, that does use the constrained type variable.
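A minimal sketch of this first approach, assuming you still want the alias for readability (the unconstrained AnyT name is my own invention -- any name works):

```python
from typing import Iterable, TypeVar

AnyT = TypeVar('AnyT')                  # unconstrained: used only by the alias
T = TypeVar('T', int, float, complex)   # constrained: used only by dot_product

# The alias is now just a plain nickname for Iterable, with no constraints.
Vector = Iterable[AnyT]

def dot_product(a: Vector[T], b: Vector[T]) -> T:
    return sum(x * y for x, y in zip(a, b))

v3: Vector[str] = ["a", "b"]  # allowed: the alias itself carries no constraint
# dot_product(v3, v3)        # ...but mypy rejects this call: T cannot be "str"

print(dot_product([1, 2, 3], [4, 5, 6]))  # 32
```

The constraint check simply moves from the alias (where mypy ignores it) to the function call (where mypy enforces it).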
A second solution would be to create a custom Vector subclass. If you do so, you'd be creating a new type and so could actually add new constraints -- but you'd no longer be able to pass lists and the like directly into your dot_product function: you'd need to wrap them in your custom Vector class first.
This can be a little clunky, but you may end up drifting toward this solution anyway: it gives you the opportunity to add custom methods to your new Vector class, which could help improve the overall readability of your code, depending on what exactly you're doing.
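One possible sketch of this second approach -- the wrapper design here is just an illustration, not the only reasonable shape:

```python
from typing import Generic, Iterator, List, TypeVar

T = TypeVar('T', int, float, complex)

class Vector(Generic[T]):
    """A real class, so the TypeVar constraints are actually enforced:
    mypy rejects Vector[str] outright."""
    def __init__(self, items: List[T]) -> None:
        self._items = items

    def __iter__(self) -> Iterator[T]:
        return iter(self._items)

def dot_product(a: Vector[T], b: Vector[T]) -> T:
    return sum(x * y for x, y in zip(a, b))

# The clunky part: plain lists must be wrapped before use.
print(dot_product(Vector([1, 2, 3]), Vector([4, 5, 6])))  # 32
```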
The third and final solution is to define a custom "Vector" Protocol. This lets us avoid wrapping our lists in a custom class -- and since we're creating a new type, we can add whatever constraints we want. For example:
from typing import Iterable, TypeVar, Iterator, List
from typing_extensions import Protocol

T = TypeVar('T', int, float, complex)

# Note: "class Vector(Protocol[T])" here means the same thing as
# "class Vector(Protocol, Generic[T])".
class Vector(Protocol[T]):
    # Any object that implements these three methods with a compatible
    # signature is considered to be compatible with "Vector".
    def __iter__(self) -> Iterator[T]: ...
    def __getitem__(self, idx: int) -> T: ...
    def __setitem__(self, idx: int, val: T) -> None: ...

def dot_product(a: Vector[T], b: Vector[T]) -> T:
    return sum(x * y for x, y in zip(a, b))

v1: Vector[int] = []    # OK: List[int] is compatible with Vector[int]
v2: Vector[float] = []  # OK: List[float] is compatible with Vector[float]
v3: Vector[str] = []    # Error: Value of type variable "T" of "Vector" cannot be "str"

dot_product(v3, v3)     # Error: Value of type variable "T" of "dot_product" cannot be "str"

nums: List[int] = [1, 2, 3]
dot_product(nums, nums)  # OK: List[int] is compatible with Vector[int]
The main disadvantage to this approach is that you can't really add any methods with actual logic to your protocol that you can reuse between anything that might be considered a "Vector". (Well, you sort of can, but not in any way that'll be useful in your example).