I have the following code:

```cpp
#include <iostream>

template<size_t N>
class A
{
};

template<int N, typename T> class B;

template<int N>
class B<N, A<N>>
{
};

int main()
{
    B<3, A<3>> b;
    return 0;
}
```
Here, `B` is templated on an `int` while `A` is templated on `size_t`, which is an `unsigned long` with both compilers I am using.

When I use compiler 1 (our current compiler), everything compiles and works the way I expect it to. When using compiler 2 (the one we're moving to), I get a compiler error stating that there is no template specialization for `B` that takes an `unsigned long`: it has interpreted the `3` as an `unsigned long`, since it needs to be one for `A`, but then can't find a matching specialization of `B`.

The fix is obvious: just change `B` to take a `size_t` as well (or change `A` to take an `int`). But I was wondering which compiler is strictly correct according to the standard. My gut feeling is that it's compiler 2 (the one that's throwing the error).
From [temp.deduct.type]:

> If `P` has a form that contains `<i>`, and if the type of `i` differs from the type of the corresponding template parameter of the template named by the enclosing *simple-template-id*, deduction fails.
`A<N>` where `N` is declared as an `int` should fail deduction, because the corresponding template parameter of `A` is actually `size_t`. This is a compiler #1 bug.